Android Developers Blog: The official Android Developers blog covering the latest news on app development tools, platform updates, training, and documentation for developers across every Android device.
Announcing Kotlin Multiplatform Shared Module Template <meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi2UwvjbFzf_BidFrger9mzJQmJ61A9kJAU5ENv_32s55N6fh3GVFLtz47TAl1Ax3mWkk3ltsaDFubqlqDHfX6y0WYax4Je92Zlebv-qih3X68zHR1MD8xEkkK7cPXCexw69PLzkKPzEQ8NQPAHNjhc7TLSVRRasiTDa_sIPDn144dN9D5hT9A_XwlkJ-o/s1600/Op2_AndroidKoitlin_Multiplatform_SharedModule_Blogger.png" name="twitter:image"/>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi2UwvjbFzf_BidFrger9mzJQmJ61A9kJAU5ENv_32s55N6fh3GVFLtz47TAl1Ax3mWkk3ltsaDFubqlqDHfX6y0WYax4Je92Zlebv-qih3X68zHR1MD8xEkkK7cPXCexw69PLzkKPzEQ8NQPAHNjhc7TLSVRRasiTDa_sIPDn144dN9D5hT9A_XwlkJ-o/s1600/Op2_AndroidKoitlin_Multiplatform_SharedModule_Blogger.png" style="display: none;" />
<em>Posted by Ben Trengrove - Developer Relations Engineer, Matt Dyor - Product Manager</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgQeI-Ng_f6_CGYh-v9eO-sy0P3PTXLcfl6GLOqSZoH9GJ6XWx0h0-9f23NYL8f2gV-TRGQzOtjKH5Jj8IUa70Gc_lgZWi_158AcAkYd98kBWGoW7necXpKtHQK8821bIgjY1m8hlso0kMTr3gw10liioR0c0YUY0tI9boFGOcvCKKZsJEm1C63UPKNEfE/s1600/Op2_AndroidKoitlin_Multiplatform_SharedModule_Hero_Blog.png"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgQeI-Ng_f6_CGYh-v9eO-sy0P3PTXLcfl6GLOqSZoH9GJ6XWx0h0-9f23NYL8f2gV-TRGQzOtjKH5Jj8IUa70Gc_lgZWi_158AcAkYd98kBWGoW7necXpKtHQK8821bIgjY1m8hlso0kMTr3gw10liioR0c0YUY0tI9boFGOcvCKKZsJEm1C63UPKNEfE/s1600/Op2_AndroidKoitlin_Multiplatform_SharedModule_Hero_Blog.png" /></a>
<p>To empower Android developers, we’re excited to announce Android Studio’s new Kotlin Multiplatform (KMP) Shared Module Template. The template is designed to let developers use a single codebase for business logic across platforms: you can add a shared module to an existing Android app and share its business logic across your Android and iOS applications.</p>
<p>This makes it easier for Android developers to craft, maintain, and most importantly, own the business logic. The <b>KMP Shared Module Template</b> is available within Android Studio when you create a new module within a project.</p>
<image><div style="text-align: center;"><img alt="a screen shot of the new module tab in Android Studio" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjDB0ElhUV2jcJKU98tUhA81TvYEn6FzSIDIlgrfr-bIeTvYdM81iGhhmIf02isCk11k9um51JZaIerFZxvzSP6RMMRVcXaNGzHeK0YACo6sZZO7FWaPscCatJdns3VB3ftchu3ZXeP6bUR7c8U4JJq5R7bAWbabMqWHe3oA7CkraAYHWAQ1QRD3ZNtckg/s16000/shared-module-template-new-module-tab-kotlin-multiplatform-google-io.png" /></div><imgcaption><center><em>Shared Module Templates are found under the New Module tab</em></center></imgcaption></image><br />
<h2><span style="font-size: x-large;">A single code base for business logic</span></h2>
<p>Most developers have grown accustomed to maintaining a separate codebase for each platform. Whenever the business logic changes, it must be carefully updated in every codebase. With the KMP Shared Module Template:</p>
<ul>
<li>Developers can write the business logic once and publish it wherever they need it.</li>
<li>Engineering teams can deliver more, faster.</li>
<li>User experiences are more consistent across the entire audience, regardless of platform or form factor.</li>
<li>Releases are better coordinated and launched with fewer errors.</li>
</ul>
<p>Customers and developer teams who adopt the KMP Shared Module Template should expect greater ROI, as mobile teams can spend more time delighting their users and less time worrying about inconsistent code.</p>
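<p>To make the idea concrete, here is a minimal sketch of the kind of business logic a team might place in a shared module's <code>commonMain</code> source set. The <code>CheckoutCalculator</code> class and all names in it are hypothetical, and the code deliberately uses only multiplatform-safe Kotlin, so the identical source could compile for both the Android app and the iOS framework:</p>

```kotlin
import kotlin.math.roundToLong

// Hypothetical example of shared business logic that would live in a
// KMP shared module's commonMain source set. It avoids JVM-only APIs,
// so the same file serves both the Android and iOS targets.
data class CartItem(val name: String, val priceCents: Long, val quantity: Int)

class CheckoutCalculator(private val taxRate: Double) {
    // Subtotal across all items, in cents.
    fun subtotalCents(items: List<CartItem>): Long =
        items.sumOf { it.priceCents * it.quantity }

    // Total including tax, rounded to the nearest cent.
    fun totalCents(items: List<CartItem>): Long =
        (subtotalCents(items) * (1 + taxRate)).roundToLong()
}
```

<p>Both apps call the same <code>CheckoutCalculator</code>, so a pricing-rule change is made exactly once and ships to Android and iOS together.</p>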
<h2><span style="font-size: x-large;">KMP enthusiasm</span></h2>
<p>The Android developer community remains very excited about KMP, especially after Google I/O 2024 where Google announced official support for shared logic across Android and iOS. We have seen continued momentum and enthusiasm from the community. For example, there are now over 1,500 KMP libraries listed on JetBrains' <a href="https://klibs.io/" target="_blank">klibs.io</a>.</p>
<p>Our customers are excited because KMP has made Android developers more productive. Android developers have consistently said that they want solutions that make it easier to share code, and tools that boost productivity. This is why we recommend KMP: it delivers a great experience for Android users while boosting ROI for app makers. The KMP Shared Module Template is the latest step towards a developer ecosystem where user experience is consistent and applications are updated seamlessly.</p>
<h2><span style="font-size: x-large;">Large scale KMP adoptions</span></h2>
<p>This KMP Shared Module Template is new, but KMP more broadly is a maturing technology with several large-scale migrations underway. In fact, KMP has matured enough to support mission critical applications at Google. Google Docs, for example, is now running KMP in production on iOS with runtime performance on par or better than before. Beyond Google, <a href="https://www.stoneco.com.br/en/" target="_blank">Stone’s</a> 130 mobile developers are sharing over 50% of their code, allowing existing mobile teams to ship features approximately 40% faster to both Android and iOS.</p>
<h2><span style="font-size: x-large;">KMP was designed for Android development</span></h2>
<p>As always, we've designed the Shared Module Template with the needs of Android developer teams in mind. Making the KMP Shared Module Template part of the native Android Studio experience allows developers to efficiently add a shared module to an existing Android application and immediately start building shared business logic that leverages several KMP-ready Jetpack libraries including Room, SQLite, and DataStore to name just a few.</p>
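<p>For illustration, a shared module's Gradle build script typically declares an Android target alongside iOS targets. The following is a simplified sketch, not the template's literal output; the module name, namespace, and commented dependency coordinates are placeholders, and exact DSL details vary by Kotlin and Android Gradle plugin version:</p>

```kotlin
// shared/build.gradle.kts — simplified sketch of a KMP shared module.
plugins {
    kotlin("multiplatform")
    id("com.android.library")
}

kotlin {
    androidTarget()           // consumed directly by the existing Android app
    iosArm64()                // iOS devices
    iosSimulatorArm64()       // iOS simulator

    sourceSets {
        val commonMain by getting {
            dependencies {
                // KMP-ready Jetpack libraries would go here, for example:
                // implementation("androidx.room:room-runtime:<version>")
                // implementation("androidx.datastore:datastore-preferences-core:<version>")
            }
        }
    }
}

android {
    namespace = "com.example.shared"  // placeholder namespace
    compileSdk = 35
}
```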
<h2><span style="font-size: x-large;">Come check it out at KotlinConf</span></h2>
<p>Releasing Android Studio’s KMP Shared Module Template marks a significant step toward empowering Android development teams to innovate faster, to efficiently manage business logic, and to build high-quality applications with greater confidence. It means that Android developers can be responsible for the code that drives the business logic for every app across Android and iOS. We’re excited to bring the Shared Module Template to <a href="https://kotlinconf.com/" target="_blank"><b>KotlinConf in Copenhagen, May 21 - 23</b></a>.</p>
<image><div style="text-align: center;"><img alt="KotlinConf 2025 Copenhagen Denmark, May 21 Workshops May 22-23 Conference" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj3C2szkC05YnqraA6dkb0Q0PQRoHvGXCE1q67H09RfZ1vzqr3BYzRtblX82Sqzysn2rMsiXKZK3z7rz_YqdKlkkyAiH7hG8GS3THSElpsSoVuAqOMId5S3P1Yhs7wUWbyTwjfM5wHVfCvqFmtxNtmaqzSSt2ke48mnrfihTBqTwviopIU5snnaVald3y0/s1600/kotlinconf-2025-copenhagen-denmark-workshops-conference-google-io.png" /></div></image><br />
<h2><span style="font-size: x-large;">Get started with KMP Shared Module Template</span></h2>
<p>To get started, you'll need the latest version of Android Studio. In your Android project, the Shared Module Template is available when you create a new module. Click “File”, then “New”, then “New Module”, and finally “Kotlin Multiplatform Shared Module”, and you are ready to add a KMP Shared Module to your Android app.</p>
<p>We appreciate any feedback on things you like or features you would like to see. If you find a bug, please report the issue. Remember to also follow us on X, LinkedIn, Blog, or YouTube for more Android development updates!</p>16 things to know for Android developers at Google I/O 2025<meta name="twitter:image" content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9MIW2UBiiU-RrzVtvnukfm2wzj2-OFAM75mAH2f3yZyoPoivSPB0ycuUgkSnTEhJo1EpkfZCOpEdGHQT8ICfD8qvoB2euLdesX7M5KxguVCM-2xgPsHXwLTPkB5txjhaoqI8VBwBB0InamA7idRMkP_0aUMq-sGyzXhHzR6gX_n_v0jxDJ-CDOIO3HrE/s1600/O25-BHero-Android-5-Meta.png">
<img style="display:none" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh9MIW2UBiiU-RrzVtvnukfm2wzj2-OFAM75mAH2f3yZyoPoivSPB0ycuUgkSnTEhJo1EpkfZCOpEdGHQT8ICfD8qvoB2euLdesX7M5KxguVCM-2xgPsHXwLTPkB5txjhaoqI8VBwBB0InamA7idRMkP_0aUMq-sGyzXhHzR6gX_n_v0jxDJ-CDOIO3HrE/s1600/O25-BHero-Android-5-Meta.png">
<em>Posted by <a href="https://x.com/matthewmccull" target="_blank">Matthew McCullough</a> – VP of Product Management, Android Developer</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMBMzX28kZHHT7TSOgMqoyOJpzfVBFX1TdkbxXSV0TJQJvbm0d905KKfolPW-PtTxAhWF_r_NjaUybrhcDOXntZSb1txCh5QgtN8minYtCmT_m8WecYRlAefmvwRJDdUgP-XGABgORRTEvLpNi5HYzaa6x6KRREoRql6X4W3L_BL6Dex75XmY4ybcjHGs/s1600/O25-BHero-Android-5.png" imageanchor="1" ><img style="width: 100%;" border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMBMzX28kZHHT7TSOgMqoyOJpzfVBFX1TdkbxXSV0TJQJvbm0d905KKfolPW-PtTxAhWF_r_NjaUybrhcDOXntZSb1txCh5QgtN8minYtCmT_m8WecYRlAefmvwRJDdUgP-XGABgORRTEvLpNi5HYzaa6x6KRREoRql6X4W3L_BL6Dex75XmY4ybcjHGs/s1600/O25-BHero-Android-5.png" data-original-width="100%" data-original-height="800" /></a>
<p>Today at <a href="https://io.google/2025/" target="_blank">Google I/O</a>, we announced the many ways we’re helping you build excellent, adaptive experiences, and helping you stay more productive through updates to our tooling that put AI at your fingertips and throughout your development lifecycle. Here’s a recap of 16 of our favorite announcements for Android developers; you can also <a href="https://android-developers.googleblog.com/2025/05/the-android-show-io-edition.html" target="_blank">see what was announced</a> last week in <a href="https://www.android.com/new-features-on-android/io-2025/" target="_blank">The Android Show: I/O Edition</a>. And stay tuned over the next two days as we <a href="https://io.google/2025/explore?focus_areas=Android" target="_blank">dive into all of the topics in more detail</a>!</p>
<h2><span style="font-size: x-large">Building AI into your Apps</span></h2>
<h3><span style="font-size: large">1: Building intelligent apps with Generative AI</span></h3>
<p>Generative AI enhances apps' experiences by making them intelligent, personalized, and agentic. This year, we <a href="https://android-developers.googleblog.com/2025/05/on-device-gen-ai-apis-ml-kit-gemini-nano.html" target="_blank">announced new ML Kit GenAI APIs using Gemini Nano</a> for common on-device tasks like summarization, proofreading, rewriting, and image description. We also <a href="https://io.google/2025/explore/technical-session-13" target="_blank">provided capabilities</a> for developers to harness more powerful models such as Gemini Pro, Gemini Flash, and Imagen via Firebase AI Logic for more complex use cases, like image generation and processing extensive data across modalities, including bringing <a href="https://io.google/2025/explore/technical-session-2" target="_blank">AI to life in Android XR</a>. A new AI sample app, <a href="https://android-developers.googleblog.com/2025/05/androidify-building-ai-driven-experiences-jetpack-compose-gemini-camerax.html" target="_blank">Androidify</a>, showcases how these APIs can transform your selfies into unique Android robots! To start building intelligent experiences with these new capabilities, explore the <a href="http://d.android.com/ai" target="_blank">developer documentation</a> and <a href="http://github.com/android/ai-samples" target="_blank">sample apps</a>, and watch the <a href="https://io.google/2025/explore/technical-session-20" target="_blank">overview session</a> to choose the right solution for your app.</p>
<h2><span style="font-size: x-large">New experiences across devices</span></h2>
<h3><span style="font-size: large">2: One app, every screen: think adaptive and unlock 500 million screens</span></h3>
<p>Mobile Android apps form the foundation across phones, foldables, tablets, and ChromeOS, and this year we’re helping you bring them to cars and XR, and expanding their reach with desktop windowing and connected displays. This expansion means tapping into an ecosystem of 500 million devices – a significant opportunity to engage more users when you <b>think adaptive</b>, <a href="https://android-developers.googleblog.com/2025/05/adaptiveapps-io25.html" target="_blank">building a single mobile app</a> that works across form factors. Resources, including the <a href="https://developer.android.com/develop/ui/compose/build-adaptive-apps#compose_material_3_adaptive" target="_blank">Compose Layouts library</a> and <a href="http://goo.gle/nav3" target="_blank">Jetpack Navigation</a> updates, help make building these dynamic experiences easier than before. You can see how <a href="https://android-developers.googleblog.com/2025/05/peacock-optimizes-streaming-jetpack-compose.html" target="_blank">Peacock, NBCUniversal’s streaming service (available in the US), is building adaptively</a> to meet users where they are.</p>
<iframe class="BLOG_video_class" allowfullscreen="" youtube-src-id="ooRcQFMYzmA" width="100%" height="498" src="https://www.youtube.com/embed/ooRcQFMYzmA"></iframe><imgcaption><center><em><b>Disclaimer:</b> Peacock is available in the US only. This video will only be viewable to US viewers.</em></center></imgcaption><br/>
<h3><span style="font-size: large">3: Material 3 Expressive: design for intuition and emotion</span></h3>
<p>The new <a href="https://m3.material.io/blog/building-with-m3-expressive?utm_source=blog&utm_medium=motion&utm_campaign=IO25" target="_blank">Material 3 Expressive</a> update provides tools to enhance your product's appeal by harnessing emotional UX, making it more engaging, intuitive, and desirable for users. Check out the I/O talk to <a href="https://io.google/2025/explore/technical-session-24" target="_blank">learn more about expressive design</a> and how it inspires emotion, clearly guides users toward their goals, and offers a flexible and personalized experience.</p>
<image><div style="text-align: center;"><img alt="moving image of Material 3 Expressive demo" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9fkaR659shSkWXUVdlrR6N6JnD_3c1cFOV-x4wkpSgF01807L3vICUfsA45R-A-B1r2AtHdwkUnC4XKpvD5G2p-FjTsF177qBpFyhBJtQ0Z7cZiPdxRZkeKZv00N_pJL3Tpom6Sdx49r4FZW79uc07ov3twERtgqPiYaBLg2AWI3sONZE4pCdPqIzSv0/s1600/new-in-jetpack-compose-google-io-meta.gif" width="100%"/></div></image><br/>
<h3><span style="font-size: large">4: Smarter widgets, engaging live updates</span></h3>
<p>Measure the return on investment of your widgets (available soon) and easily create personalized widget previews with <a href="https://developer.android.com/jetpack/androidx/releases/glance#1.2.0-alpha01" target="_blank">Glance 1.2</a>. Promoted <a href="https://io.google/2025/explore/technical-session-53" target="_blank">Live Updates</a> notify users of important ongoing notifications and come with a new <a href="http://goo.gle/live-updates" target="_blank">Progress Style</a> standardized template.</p>
<image><div style="text-align: center;"><img alt="moving image of widgets and Live Updates on Android 16" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhNBO_jkUA2oo4uA70kc2GhnDWiXMExNX7T9K9ANfUqNP_YqeMZnPGPpPaJevmwLOrVeuzdgF8hJxvNT3jZ3FYBfx88T6EA_kOZKBNcbF-kSMKB1_9R5z-V0QqiFrNV0wmar-qk3AQO0Bt53dCtkssW2RDtS1lCosWESEQ7MSFgQXhRnXTNlBrBnbHMaO8/s1600/widgets-live-updates-android-16.gif" width="100%"/></div></image><br/>
<h3><span style="font-size: large">5: Enhanced Camera & Media: low light boost and battery savings</span></h3>
<p>This year's I/O introduces several camera and media enhancements. These include a software low light boost for improved photography in dim lighting and native PCM offload, allowing the DSP to handle more audio playback processing, thus conserving user battery. Explore our detailed sessions on <a href="https://io.google/2025/explore/technical-session-19" target="_blank">built-in effects within CameraX and Media3</a> for further information.</p>
<h3><span style="font-size: large">6: Build next-gen app experiences for Cars</span></h3>
<p>We're launching expanded opportunities for developers to build in-car experiences, including new Gemini integrations, support for more app categories like Games and Video, and enhanced capabilities for media and communication apps via the Car App Library and new APIs. Alongside updated <a href="https://developer.android.com/docs/quality-guidelines/car-app-quality" target="_blank">car app quality tiers</a> and simplified distribution, we'll soon be providing improved testing tools like <a href="https://developer.android.com/training/cars/testing/aaos-on-pixel" target="_blank">Android Automotive OS on Pixel Tablet</a> and Firebase Test Lab access to help you bring your innovative apps to cars. Learn more from our <a href="https://io.google/2025/explore/technical-session-18" target="_blank">technical session</a> and <a href="https://android-developers.googleblog.com/2025/05/android-for-cars-google-io-2025.html" target="_blank">blog post on new in-car app experiences</a>.</p>
<h3><span style="font-size: large">7: Build for Android XR's expanding ecosystem with Developer Preview 2 of the SDK</span></h3>
<p>We <a href="https://blog.google/products/android/android-xr/" target="_blank">announced Android XR</a> in December, and today at Google I/O we shared updates coming to the platform, including Developer Preview 2 of the Android XR SDK and an expanding ecosystem of devices: in addition to the first Android XR headset, Samsung’s Project Moohan, you’ll also see a new portable Android XR device from our partners at XREAL. There’s lots more to cover for <a href="https://developer.android.com/develop/xr" target="_blank">Android XR</a>: watch the <a href="https://io.google/2025/explore/technical-session-2" target="_blank">Compose and AI on Android XR session</a> and the <a href="https://io.google/2025/explore/technical-session-22" target="_blank">Building differentiated apps for Android XR with 3D content session</a>, and learn more about <a href="https://android-developers.googleblog.com/2025/05/updates-to-android-xr-sdk-developer-preview.html" target="_blank">building for Android XR</a>.</p>
<image><div style="text-align: center;"><img alt="product image of XREAL’s Project Aura against a nebulous black background" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgujCjZD5_MiRHDiDp_O-PRCGs_CdfARZbtfC5xqrKgMlJbWZQxNwdNN9C4SwVYB7Lu0Pm8GTubBcVenZm07pCIMAuAx5WcFjADHyE0JeQYhIlEmTqoW0te7xBz15Ab3Oh7C8IN4uk5zLnvqRJmUIEz0lOtSMO-e1oeiWK-jo2NvvrzBQoNUSU7_0UFmf4/s1600/android-xr-google-io-meta.png" width="100%" /></div><imgcaption><center><em>XREAL’s Project Aura</em></center></imgcaption></image><br />
<h3><span style="font-size: large">8: Express yourself on Wear OS: meet Material Expressive on Wear OS 6</span></h3>
<p>This year we are launching Wear OS 6: the most powerful and expressive version of Wear OS. <a href="https://blog.google/products/android/material-3-expressive-android-wearos-launch/" target="_blank">Wear OS 6 features Material 3 Expressive</a>, a new UI design with personalized visuals and motion for user creativity, coming to Wear, Android, and Google apps later this year. Developers gain access to Material 3 Expressive on Wear OS through new Jetpack libraries: Wear Compose Material 3, which provides components for apps, and Wear ProtoLayout Material 3, which provides components and layouts for tiles. Get started with <a href="https://android-developers.googleblog.com/2025/05/whats-new-in-wear-os-6.html" target="_blank">Material 3 libraries and other updates on Wear</a>.</p>
<image><div style="text-align: center;"><img alt="moving image displays examples of Material 3 Expressive on Wear OS experiences" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjX8QuioDGc_JBakiOJJ61-2pX5dWwklsJL_eFJMl4EvMpst4kfMEwi3We1_UxfVFlJEmOGVVpX3obgnoQgr6QFtH3JNu0T-B526Z23vz1tZA5cgy7yHZsgc71Zrcd5GLPFba08tTSdZfQwueS4q65KCm864z9yFd1aoOU1MaLUHSReLvl3FlogIiAdd94/s1600/material3-expressive-wear-os.gif" width="100%" /></div><imgcaption><center><em>Some examples of Material 3 Expressive on Wear OS experiences</em></center></imgcaption></image><br />
<h3><span style="font-size: large">9: Engage users on Google TV with excellent TV apps</span></h3>
<p>You can leverage more resources within Compose's core and Material libraries with the stable release of Compose for TV, empowering you to build excellent adaptive UIs across your apps. We're also thrilled to share <a href="https://android-developers.googleblog.com/2025/05/engage-users-google-tv-excellent-apps.html" target="_blank">exciting platform updates and developer tools designed to boost app engagement</a>, including bringing Gemini capabilities to TV in the fall, opening enrollment for our Video Discovery API, and more.</p>
<h2><span style="font-size: x-large">Developer productivity</span></h2>
<h3><span style="font-size: large">10: Build beautiful apps faster with Jetpack Compose</span></h3>
<p><a href="https://developer.android.com/compose" target="_blank">Compose</a> is our big bet for UI development. The latest stable BOM release provides the features, performance, stability, and libraries that you need to build beautiful adaptive apps faster, so you can focus on <a href="https://android-developers.googleblog.com/2025/05/whats-new-in-jetpack-compose.html" target="_blank">what makes your app valuable to users</a>.</p>
<image><div style="text-align: center;"><img alt="moving image of compose adaptive layouts updates in the Google Play app" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9ir7eGSz036zy966UuT7AeqwENw5jbJpvzJKgZvEvg-55RlZHecPCUvF2QJifyt0WwE9wjj95MYp0Y6uR7ZjfKkmpGHlaZ09Qx-t7OCSqKkqWiaDeQkToabXWqe3jJhSaehclZiNPkWejK1jG0UD8rJqC-3PB5EIs0EyIX98iDijA9r4G4Pvlj4Vm50w/s16000/compose-adaptive-layouts-updates-google-play-app.gif" /></div><imgcaption><center><em>Compose Adaptive Layouts Updates in the Google Play app</em></center></imgcaption></image><br />
<h3><span style="font-size: large">11: Kotlin Multiplatform: new Shared Template lets you build across platforms, easily</span></h3>
<p>Kotlin Multiplatform (KMP) enables teams to reach new audiences across Android and iOS with less development time. We’ve released a new Android Studio <a href="https://developer.android.com/kotlin/multiplatform/migrate" target="_blank">KMP shared module template</a>, updated <a href="https://developer.android.com/kotlin/multiplatform" target="_blank">Jetpack libraries</a> and new codelabs (<a href="https://developer.android.com/codelabs/kmp-get-started" target="_blank">Getting started with Kotlin Multiplatform</a> and <a href="https://developer.android.com/codelabs/kmp-migrate-room" target="_blank">Migrating your Room database to KMP</a>) to help developers who are looking to get started with KMP. Shared module templates make it easier for developers to craft, maintain, and own the business logic. Read more on <a href="https://android-developers.googleblog.com/2025/05/android-kotlin-multiplatform-google-io-kotlinconf-2025.html" target="_blank">what's new in Android's Kotlin Multiplatform</a>.</p>
<h3><span style="font-size: large">12: Gemini in Android Studio: AI Agents to help you work</span></h3>
<p><a href="https://developer.android.com/gemini-in-android" target="_blank">Gemini in Android Studio</a> is the AI-powered coding companion that makes Android developers more productive at every stage of the dev lifecycle. In March, we <a href="https://android-developers.googleblog.com/2025/03/multimodal-image-attachment-now-available-gemini-android-studio.html" target="_blank">introduced Image to Code</a> to bridge the gap between UX teams and software engineers by intelligently converting design mockups into <a href="https://www.youtube.com/watch?v=f_6mtRWJzuc" target="_blank">working Compose UI code</a>. And today, we previewed new agentic AI experiences, <a href="https://www.youtube.com/watch?v=mP1tlIKK0R4" target="_blank">Journeys</a> for Android Studio and <a href="https://www.youtube.com/watch?v=ubyPjBesW-8" target="_blank">Version Upgrade Agent</a>. These innovations make it easier to build and test code. You can read more about these updates in <a href="https://android-developers.googleblog.com/2025/05/google-io-2025-whats-new-in-android-development-tools.html" target="_blank">What’s new in Android development tools</a>.</p>
<iframe class="BLOG_video_class" allowfullscreen="" youtube-src-id="ubyPjBesW-8" width="100%" height="413" src="https://www.youtube.com/embed/ubyPjBesW-8"></iframe>
<h3><span style="font-size: large">13: Android Studio: smarter with Gemini</span></h3>
<p>In this latest release, we're empowering devs with AI-driven tools like <a href="https://developer.android.com/gemini-in-android" target="_blank">Gemini in Android Studio</a>, <a href="https://android-developers.googleblog.com/2025/03/multimodal-image-attachment-now-available-gemini-android-studio.html" target="_blank">streamlining UI creation</a>, <a href="https://www.youtube.com/watch?v=mP1tlIKK0R4" target="_blank">making testing easier</a>, and ensuring apps are future-proofed in our ever-evolving Android ecosystem. These innovations accelerate development cycles, improve app quality, and help you stay ahead in a dynamic mobile landscape. To take advantage, upgrade to <a href="https://developer.android.com/studio/preview" target="_blank">the latest Studio release</a>. You can read more about these innovations in <a href="https://android-developers.googleblog.com/2025/05/google-io-2025-whats-new-in-android-development-tools.html" target="_blank">What’s new in Android development tools</a>.</p>
<image><div style="text-align: center;"><img alt="moving image of Gemini in Android Studio Agentic Experiences including Journeys and Version Upgrade" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjelKyx9X7027dUVJZ1q_P3H_yb57I83pd0163JW8iZc0TqkmDpGtX-BZztxlTzrz6IxxbPy6IfYrYM57h_yis57Zf3paFulaoBX56rYRi6Fe2hm2HTmfcxpZ6qM2NLVWNWUkTVZX6S4_4FkBby2ONmztmQ38BcCetvshRMof-mO9n4rEWYTMw-8kaXvyA/s1600/gemini-in-android-studio-journeys.gif" /></div></image><br />
<h2><span style="font-size: x-large">And the latest on driving business growth</span></h2>
<h3><span style="font-size: large">14: What’s new in Google Play</span></h3>
<p>Get ready for exciting updates from Play designed to boost your discovery, engagement and revenue! Learn how we’re continuing to become a content-rich destination with enhanced personalization and fresh ways to showcase your apps and content. Plus, explore powerful new subscription features designed to streamline checkout and reduce churn. Read <a href="https://android-developers.googleblog.com/2025/05/io-2025-whats-new-in-google-play.html" target="_blank">I/O 2025: What's new in Google Play</a> to learn more.</p>
<image><div style="text-align: center;"><img alt="a moving image of three mobile devices displaying how content is displayed on the Play Store" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiNeLmRYs6g8B9gGtuXJ9WJjOXWo62idQ9V-F8bqqm1k_29Yiwamy1faBLDd0SJRC0B6HYCiY11iWJjfYZoO7qn4UJY4zU4ckStiG3iSvAXqV76rHpVJj0mkTYs73fyjrOj3SMIkVOU77NLwWx2D-VFO-E9_0qPen1U2owcKq_8jt1Zth5NbX0nnBL4StU/s16000/curated-spaces-new-in-play-google-io.gif" /></div></image><br />
<h3><span style="font-size: large">15: Start migrating to Play Games Services v2 today</span></h3>
<p><a href="https://developer.android.com/games/pgs/overview" target="_blank">Play Games Services (PGS)</a> connects over 2 billion gamer profiles on Play, powering cross-device gameplay, personalized gaming content and rewards for your players throughout the gaming journey. We are moving PGS v1 features to v2 with more advanced features and an easier integration path. Learn more about the <a href="https://io.google/2025/explore/technical-session-15" target="_blank">migration timeline and new features</a>.</p>
<h3><span style="font-size: large">16: And of course, Android 16</span></h3>
<p>We unpacked some of the latest features coming to users in <a href="https://developer.android.com/about/versions/16" target="_blank">Android 16</a>, which we’ve been <a href="https://android-developers.googleblog.com/search?q=Android+16" target="_blank">previewing with you</a> for the last few months. If you haven’t already, make sure to <a href="https://developer.android.com/about/versions/16/get" target="_blank">test your apps with the latest Beta of Android 16</a>. Android 16 includes Live Updates, professional media and camera features, desktop windowing and connected displays, major accessibility enhancements and much more.</p>
<h2><span style="font-size: x-large">Check out all of the Android and Play content at Google I/O</span></h2>
<p>This was just a preview of some of the cool updates for Android developers at Google I/O, but <a href="https://io.google/2025/" target="_blank">stay tuned to Google I/O</a> over the next two days as we dive into a range of Android developer topics in more detail. You can check out the <a href="https://io.google/2025/explore/pa-keynote-7" target="_blank">What’s New in Android</a> and the <a href="https://io.google/2025/explore?focus_areas=Android" target="_blank">full Android track of sessions</a>, and whether you’re joining in person or around the world, we can’t wait to engage with you!</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p><br />What’s new in Wear OS 6 <meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjeHGsTq83LxzI8LuR4L6YD8IR7NfVNuOdty-7Ctx7uMvxUsSlOaTjf49NA4Lru9VG9BSq275Xh_BLzkQ4zbmF08DMWlg3OMLKg61889Qd2_wabbCdOFpqsDXXeNdvnoCQjvc_fvlzhqPIx_h7psbVp-iNXmBnGLT4CDjIMWeNTDsdENEyTUFrL1nhHH5g/s1600/new-in-wear-os-6-google-io-2025%20%282%29.png" name="twitter:image"/>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjeHGsTq83LxzI8LuR4L6YD8IR7NfVNuOdty-7Ctx7uMvxUsSlOaTjf49NA4Lru9VG9BSq275Xh_BLzkQ4zbmF08DMWlg3OMLKg61889Qd2_wabbCdOFpqsDXXeNdvnoCQjvc_fvlzhqPIx_h7psbVp-iNXmBnGLT4CDjIMWeNTDsdENEyTUFrL1nhHH5g/s1600/new-in-wear-os-6-google-io-2025%20%282%29.png" style="display: none;" />
<em>Posted by <a href="https://www.linkedin.com/in/chiarachiappini/" target="_blank">Chiara Chiappini</a> – Developer Relations Engineer</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjP5d-DTtcDafKrQB-N_rY9u_SZxiuCpXFMxS5qG8EwGSpejAJcOUxAcumJzTqU_TQrYkk-9D9IWSB30Hw5JJSr_fDtU4RDW4b6bVqmeeEr-bmRAB4Q0zFSu_Yeu2AeIi_fbcCJskN7HxdSpxSwAxJw1PEO3LrUP0lcUFrZEc-gClxpXRih3PJuQ8N6WWY/s1600/new-in-wear-os-6-google-io-2025%20%281%29.png"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjP5d-DTtcDafKrQB-N_rY9u_SZxiuCpXFMxS5qG8EwGSpejAJcOUxAcumJzTqU_TQrYkk-9D9IWSB30Hw5JJSr_fDtU4RDW4b6bVqmeeEr-bmRAB4Q0zFSu_Yeu2AeIi_fbcCJskN7HxdSpxSwAxJw1PEO3LrUP0lcUFrZEc-gClxpXRih3PJuQ8N6WWY/s1600/new-in-wear-os-6-google-io-2025%20%281%29.png" /></a>
<p>This year, we’re excited to introduce Wear OS 6: the most power-efficient and expressive version of Wear OS yet.</p>
<p>Wear OS 6 introduces the new design system we call <a href="http://blog.google/products/android/material-3-expressive-android-wearos-launch" target="_blank">Material 3 Expressive</a>. It features a major refresh with visual and motion components designed to give users a more personalized experience. The new design offers a greater level of expression to meet user demand for experiences that are modern, relevant, and distinct. Material 3 Expressive is coming to Wear OS, Android, and all your favorite Google apps on these devices later this year.</p>
<p>The good news is that you don’t need to compromise battery for beauty: thanks to Wear OS platform optimizations, watches updating from Wear OS 5 to Wear OS 6 can see up to 10% improvement in battery life.<sup>1</sup></p>
<h2><span style="font-size: x-large;">Wear OS 6 developer preview</span></h2>
<p>Today we’re releasing the Developer Preview of Wear OS 6, the next version of Google’s smartwatch platform, based on Android 16.</p>
<p>Wear OS 6 brings a number of developer-facing changes, such as refining the always-on display experience. <a href="https://developer.android.com/training/wearables/versions/6/changes" target="_blank">Check out what’s changed</a> and <a href="https://developer.android.com/training/wearables/versions/6/emulator" target="_blank">try the new Wear OS 6 emulator</a> to test your app for compatibility with the new platform version.</p>
<h2><span style="font-size: x-large;">Material 3 Expressive on Wear OS</span></h2>
<image><div style="text-align: center;"><img alt="moving image displays examples of Material 3 Expressive on Wear OS experiences" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjX8QuioDGc_JBakiOJJ61-2pX5dWwklsJL_eFJMl4EvMpst4kfMEwi3We1_UxfVFlJEmOGVVpX3obgnoQgr6QFtH3JNu0T-B526Z23vz1tZA5cgy7yHZsgc71Zrcd5GLPFba08tTSdZfQwueS4q65KCm864z9yFd1aoOU1MaLUHSReLvl3FlogIiAdd94/s1600/material3-expressive-wear-os.gif" width="100%" /></div><imgcaption><center><em>Some examples of Material 3 Expressive on Wear OS experiences</em></center></imgcaption></image><br />
<p>Material 3 Expressive for the watch is fully optimized for the round display. We recommend developers embrace the new design system in their apps and tiles. To help you adopt Material 3 Expressive in your app, we have begun releasing new <a href="https://developer.android.com/design/ui/wear/guides/get-started" target="_blank">design guidance</a> for Wear OS, along with corresponding <a href="https://developer.android.com/design/ui/wear/guides/foundations/download" target="_blank">Figma design kits</a>.</p>
<p>As a developer, you can access Material 3 Expressive on Wear OS using new Jetpack libraries:</p>
<ul><ul>
<li><a href="https://developer.android.com/jetpack/androidx/releases/wear-compose#1.5.0-beta01" target="_blank">Wear Compose Material 3</a>, which provides components for apps.</li>
<li><a href="https://developer.android.com/jetpack/androidx/releases/wear-protolayout#1.3.0-beta02" target="_blank">Wear ProtoLayout Material 3</a>, which provides components and layouts for tiles.</li>
</ul></ul>
<p>These two libraries provide implementations of the component catalog that adhere to the Material 3 Expressive design language.</p>
<h3><span style="font-size: large;">Make it personal with richer color schemes using themes</span></h3>
<image><div style="text-align: center;"><img alt="moving image showing how dynamic color theme updates colors of apps and Tiles" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiArqKHaEcCIOWzN9rfx7Idh-PcYHpMMMrPf46kXpRXNQsNl6SSMfpO86Wobyb9WrnrJ_BLJVs0j3G8RCCyoiHZxIrouK-yzuQDwIZb8CYNK2R002NqRW5jV1DGBkFnzSmPBSV53VyxOqA_4n2-djlsJMOLUkjKiGAoTTc-JwUVFYcy3oBUYdgtjZhpnNk/s1600/wear-dynamic-color.gif" width="100%" /></div><imgcaption><center><em>Dynamic color theme updates colors of apps and Tiles</em></center></imgcaption></image><br />
<p>The Wear Compose Material 3 and Wear ProtoLayout Material 3 libraries provide updated and extended color schemes, typography, and shapes to bring both depth and variety to your designs. Additionally, your tiles now align with the system font by default (on Wear OS 6+ devices), offering a more cohesive experience on the watch.</p>
<p>Both libraries introduce <a href="https://m3.material.io/styles/color/dynamic/choosing-a-source" target="_blank">dynamic color theming</a>, which automatically generates a color theme for your app or tile to match the colors of the watch face of Pixel watches.</p>
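<p>In a Compose app, one way to adopt dynamic color theming is to fall back to your own scheme when dynamic colors are unavailable. A minimal sketch, assuming the <span style="color: #0d904f; font-family: courier;">dynamicColorScheme</span> helper from the Wear Compose Material 3 beta (treat the exact signature as an assumption and verify against the release notes for your version):</p>

```kotlin
// Hedged sketch: dynamicColorScheme() is from the Wear Compose Material 3
// beta; it returns null when the device does not provide dynamic colors.
import androidx.compose.runtime.Composable
import androidx.compose.ui.platform.LocalContext
import androidx.wear.compose.material3.MaterialTheme
import androidx.wear.compose.material3.dynamicColorScheme

@Composable
fun MyAppTheme(content: @Composable () -> Unit) {
    val context = LocalContext.current
    // Fall back to the default scheme on devices without dynamic theming.
    val colorScheme = dynamicColorScheme(context) ?: MaterialTheme.colorScheme
    MaterialTheme(colorScheme = colorScheme, content = content)
}
```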
<h3><span style="font-size: large;">Make it more glanceable with new tile components</span></h3>
<p>Tiles now support a new framework and a set of components that embrace the watch's circular form factor. These components make tiles more consistent and glanceable, so users can more easily take swift action on the information included in them.</p>
<p>We’ve introduced a 3-slot tile layout to improve visual consistency in the Tiles carousel. This layout includes a title slot, a main content slot, and a bottom slot, designed to work across a range of different screen sizes:</p>
<image><div style="text-align: center;"><img alt="moving image showing some examples of Tiles with the 3-slot tile layout" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgRRRRhI0CHL_352Sk8OSVwmDoAzsJDJlgiL3_S_uJwEZ4xYWxPptcuPfA7h8Cfd1RR_cAcJ59OWLF6wKua2kfmlgh1TGLgOSXlsAjghm7cY1liuAZRgO-piJohIJ6P_TsrIvtERwY2HIaPKx8g-AUaK6fQAUK3sCZuY_Y23ePQOKIUG7YSvLw5qzmw9HE/s1600/wear-tile-.gif" width="100%" /></div><imgcaption><center><em>Some examples of Tiles with the 3-slot tile layout.</em></center></imgcaption></image><br />
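<p>In the ProtoLayout Material 3 library, the three slots map onto parameters of <span style="color: #0d904f; font-family: courier;">primaryLayout</span>. A hedged sketch of how this might look (slot and helper names such as <span style="color: #0d904f; font-family: courier;">textEdgeButton</span> are taken from the protolayout-material3 beta; verify them against the reference for your version):</p>

```kotlin
// Hedged sketch of the 3-slot tile layout; runs inside onTileRequest(),
// where requestParams provides the device configuration.
materialScope(context, requestParams.deviceConfiguration) {
    primaryLayout(
        titleSlot = { text(text = "Meetings".layoutString) },
        mainSlot = { text(text = "Next: standup at 10:00".layoutString) },
        bottomSlot = {
            textEdgeButton(
                onClick = clickable(),
                labelContent = { text("Open".layoutString) }
            )
        }
    )
}
```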
<h3><span style="font-size: large;">Highlight user actions and key information with components optimized for round screen</span></h3>
<p>The new Wear OS Material 3 components automatically adapt to larger screen sizes, building on the <a href="https://developer.android.com/training/wearables/versions/5#whats-in-wear-os-5" target="_blank">Large Display support</a> added as part of Wear OS 5. Additionally, components such as Buttons and Lists support shape morphing on apps.</p>
<p>The following sections highlight some of the most exciting changes to these components.</p>
<h4><span style="font-size: medium;">Embrace the round screen with the Edge Hugging Button</span></h4>
<p>We introduced a new <a href="https://developer.android.com/reference/kotlin/androidx/wear/compose/material3/package-summary#EdgeButton%28kotlin.Function0,androidx.compose.ui.Modifier,androidx.wear.compose.material3.EdgeButtonSize,kotlin.Boolean,androidx.wear.compose.material3.ButtonColors,androidx.compose.foundation.BorderStroke,androidx.compose.foundation.interaction.MutableInteractionSource,kotlin.Function1%29" target="_blank">EdgeButton</a> for apps and tiles with an iconic design pattern that maximizes the space within the circular form factor, hugs the edge of the screen, and comes in 4 standard sizes.</p>
<image><div style="text-align: center;"><img alt="moving image of a screenshot representing an EdgeButton in a scrollable screen." border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiW0jCRLST8uO-3C3V2HqIwSBcf2oUQpXqxooeNEbOjOacvR2UgpF5klvrm41qZCQQLSgZK5VDmR6R80BPzH2PNG2m8X_AprYvWpBl7sP2khu-Z1EfL4Hyrri5GTqbCrvGlxL1OKZ1cgJSbsZPLdVA2WoAGNLxi8TAyl3VqzrpptS1DwOdZSEJPRZAlWDM/s1600/tlc-in-situ-dark-wear-os.gif" width="100%" /></div><imgcaption><center><em>Screenshot representing an EdgeButton in a scrollable screen.</em></center></imgcaption></image><br />
<h4><span style="font-size: medium;">Fluid navigation through lists using new indicators</span></h4>
<p>The new <span style="color: #0d904f; font-family: courier;">TransformingLazyColumn</span> from the Foundation library makes expressive motion easy, with content that fluidly traces the edges of the display. Developers can customize the collapsing behavior of the list when scrolling to the top, the bottom, or both edges of the screen. For example, components like <span style="color: #0d904f; font-family: courier;">Cards</span> can scale down as they get closer to the top of the screen.</p>
<image><div style="text-align: center;"><img alt="moving image showing a TransformingLazyColumn with content that collapses and changes in size when approaching the edge of the screens" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiGFsDUmYpPgEN814QuLbZqBc4j7dFSbaTJNcJYhxp3bD5rBV0-kFWqkkQgW2_6brCyrk_fk_QJywS1WKaaq7wEjUYYMsLigi8pYThyphenhyphen88syG3I37MZa38sB-NNP9jUbn_Mlcah9bqW2ce75XeEGUEkO9E9ZhKL1Mj0KjtcN6NIsl5pi2N3zZ5ODPneNfa4/s1600/wear-notifications-transforminglazycolumn.gif" width="100%" /></div><imgcaption><center><em>TransformingLazyColumn allows content to collapse and change in size when approaching the edge of the screens</em></center></imgcaption></image><br />
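<p>The structure of such a list can be sketched as follows (a minimal, hypothetical example; <span style="color: #0d904f; font-family: courier;">TransformingLazyColumn</span> lives in the Wear Compose Foundation beta, and the exact items DSL shown here is an assumption):</p>

```kotlin
// Hedged sketch: API names from the Wear Compose Foundation / Material 3
// betas; check the release notes for the version you depend on.
import androidx.compose.runtime.Composable
import androidx.wear.compose.foundation.lazy.TransformingLazyColumn
import androidx.wear.compose.material3.Card
import androidx.wear.compose.material3.Text

@Composable
fun MessageList(messages: List<String>) {
    TransformingLazyColumn {
        items(count = messages.size) { index ->
            // Items morph automatically as they approach the top and
            // bottom edges of the round display.
            Card(onClick = { /* open message */ }) {
                Text(messages[index])
            }
        }
    }
}
```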
<p>Material 3 Expressive also includes a <span style="color: #0d904f; font-family: courier;">ScrollIndicator</span> that features a new visual and motion design to make it easier for users to visualize their progress through a list. The <span style="color: #0d904f; font-family: courier;">ScrollIndicator</span> is displayed by default when you use a <span style="color: #0d904f; font-family: courier;">TransformingLazyColumn</span> and <span style="color: #0d904f; font-family: courier;">ScreenScaffold</span>.</p>
<image><div style="text-align: center;"><img alt="moving image showing side by side examples of ScrollIndicator in action" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEht6H03efYzN0m1cnGcSqmlF7J5gTLAVqPYtPLpHtEnpeKPutByJfvl_6KGDofT7707W9HfHP_m8Kin8zRQhhcz8IWlVjxWHkyWmHnHntrvHTFPxgEtF1tnLrvp7Zxa0VJX-rePRZBdBIljO80ge0OWSIvEWf1Jc8sS_VFKNr-P6MbOUUYDy-nhD4C9yns/s1600/scroll-indicator-tlc-dark.gif" width="100%" /></div><imgcaption><center><em>ScrollIndicator</em></center></imgcaption></image><br />
<p>Lastly, you can now use segments with the new <span style="color: #0d904f; font-family: courier;">ProgressIndicator</span>, which is available as a full-screen component for apps and as a small-size component for both apps and tiles.</p>
<image><div style="text-align: center;"><img alt="moving image showing a full-screen ProgressIndicator" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiupBp03uUWzsdobgO8ldMnq4opP5qIG0c-TqR_24AQK63OzRgzjEVKJ1bxylo4X3L_3qX9QnarEdwydJqEX64Okxk9_cuqtCDWcDsunfCk8ytoozMdss_RYE-rGK-8c8GQOR-IFEuyg8rGeeb7TaBtnn4ZFloSiK6H751JZxrgm4wVjkNmpAOZmDW2oxE/s1600/progress-indicator-dark.gif" width="100%" /></div><imgcaption><center><em>Example of a full-screen ProgressIndicator</em></center></imgcaption></image><br />
<p>To learn more about the new features and see the full list of updates, see the release notes of the latest beta release of the <a href="https://developer.android.com/jetpack/androidx/releases/wear-compose#1.5.0-beta01" target="_blank">Wear Compose</a> and <a href="https://developer.android.com/jetpack/androidx/releases/wear-protolayout#1.3.0-beta02" target="_blank">Wear ProtoLayout</a> libraries. Check out the migration guidance for <a href="https://developer.android.com/training/wearables/compose/migrate-to-material3" target="_blank">apps</a> and <a href="https://developer.android.com/training/wearables/tiles/versioning#migrate-tiles-expressive" target="_blank">tiles</a> on how to upgrade your existing apps, or try one of our <a href="https://developer.android.com/codelabs/compose-for-wear-os#0" target="_blank">codelabs</a> if you want to start developing using Material 3 Expressive design.</p>
<h2><span style="font-size: x-large;">Watch Faces</span></h2>
<p>With Wear OS 6 we are launching updates for watch face developers:</p>
<ul><ul>
<li>New options for customizing the appearance of your watch face using version 4 of Watch Face Format, such as animated state transitions from ambient to interactive and photo watch faces.</li>
</ul><ul>
<li>A new API for building watch face marketplaces.</li>
</ul></ul>
<p>Learn more about <a href="https://android-developers.googleblog.com/2025/05/whats-new-in-watch-faces.html" target="_blank">what's new in Watch Face updates</a>.</p>
<p>Look for more information about the general availability of Wear OS 6 later this year.</p>
<h2><span style="font-size: x-large;">Library updates</span></h2>
<h3><span style="font-size: large;">ProtoLayout</span></h3>
<p>Since our last major release, we've improved capabilities and the developer experience of the Tiles and ProtoLayout libraries to address feedback we received from developers. Some of these enhancements include:</p>
<ul><ul>
<li>New Kotlin-only <span style="font-family: courier;"><a href="https://developer.android.com/jetpack/androidx/releases/wear-protolayout#1.3.0-beta01" target="_blank">protolayout-material3</a></span> library adds support for enhanced visuals: Lottie animations (in addition to the <a href="https://developer.android.com/training/wearables/tiles/animations" target="_blank">existing animation capabilities</a>), more gradient types, and new arc line styles.</li>
<li>Developers can now write more idiomatic Kotlin, with APIs refined to better align with Jetpack Compose, including type-safe builders and an <a href="https://developer.android.com/reference/kotlin/androidx/wear/protolayout/modifiers/package-summary" target="_blank">improved modifier syntax</a>.</li>
</ul></ul>
<p>The example below shows how to display a layout with text on a tile using the new enhancements:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: #408080; font-style: italic;">// returns a LayoutElement for use in onTileRequest()</span>
materialScope(context, requestParams.deviceConfiguration) {
    primaryLayout(
        mainSlot = {
            text(
                text = <span style="color: #ba2121;">"Hello, World!"</span>.layoutString,
                typography = BODY_LARGE,
            )
        }
    )
}
</pre></div><br />
<p>For more information, see the <a href="https://developer.android.com/training/wearables/tiles/versioning#migrate-tiles-expressive" target="_blank">migration instructions</a>.</p>
<h2><span style="font-size: x-large;">Credential Manager for Wear OS</span></h2>
<p>The <a href="https://developer.android.com/identity/sign-in/credential-manager" target="_blank">CredentialManager API</a> is now available on Wear OS, starting with Google Pixel Watch devices running Wear OS 5.1. It introduces passkeys to Wear OS with a platform-standard authentication UI that is consistent with the experience on mobile.</p>
<p>The Credential Manager Jetpack library provides developers with a unified API that simplifies and centralizes their authentication implementation. Developers with an existing implementation on another form factor can use the same CredentialManager code, and most of the same supporting code to fulfill their Wear OS authentication workflow.</p>
<p>Credential Manager provides integration points for passkeys, passwords, and Sign in With Google, while also allowing you to keep your other authentication solutions as backups.</p>
<p>Users will benefit from a consistent, platform-standard authentication UI, the introduction of passkeys and other passwordless authentication methods, and the ability to authenticate without their phone nearby.</p>
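<p>For illustration, a minimal passkey-plus-password request with the Credential Manager Jetpack library might look like the sketch below, the same shape of code you would use on mobile. Here <span style="color: #0d904f; font-family: courier;">requestJson</span> stands in for the WebAuthn request generated by your server:</p>

```kotlin
import android.content.Context
import androidx.credentials.CredentialManager
import androidx.credentials.GetCredentialRequest
import androidx.credentials.GetPasswordOption
import androidx.credentials.GetPublicKeyCredentialOption
import androidx.credentials.PasswordCredential
import androidx.credentials.PublicKeyCredential

// Sketch: call from a coroutine tied to your activity; requestJson is the
// WebAuthn request JSON produced by your server.
suspend fun signIn(context: Context, requestJson: String) {
    val credentialManager = CredentialManager.create(context)
    val request = GetCredentialRequest(
        listOf(GetPublicKeyCredentialOption(requestJson), GetPasswordOption())
    )
    val result = credentialManager.getCredential(context, request)
    when (val credential = result.credential) {
        is PublicKeyCredential -> {
            // Send credential.authenticationResponseJson to your server to verify
        }
        is PasswordCredential -> {
            // Use credential.id and credential.password with your backend
        }
    }
}
```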
<p>Check out the <a href="https://developer.android.com/training/wearables/apps/auth-wear" target="_blank">Authentication on Wear OS guidance</a> to learn more.
</p><h2><span style="font-size: x-large;">Richer Wear Media Controls</span></h2>
<image><div style="text-align: center;"><img alt="New media controls for a Podcast" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiKo9X_Vu7uo8N55iP45HKv01XAgereOgQ-cISMY6-pJc_Pj-6DholAzeCi-Jv2oUzQRHoFHJeMscyQTbH-amzlQhXnsW_oxLnd0x04YxqBdiUGaoeb2bwUai_OPGzMSqaBagyTKUKlJL6z_Dk1W3TQ4n0oaACzbMB_oPij-Kjvd6JrD79tpW67zzOd4Wg/s1600/rich-wear-media-controls-wear-os.png" width="100%" /></div><imgcaption><center><em>New media controls for a Podcast</em></center></imgcaption></image><br />
<p>Devices that run Wear OS 5.1 or later support enhanced media controls. Users who listen to media content on phones and watches can now benefit from the following new media control features on their watch:</p>
<ul><ul>
<li>They can fast-forward and rewind while listening to podcasts.</li>
<li>They can access the playlist and controls such as shuffle, like, and repeat through a new menu.</li>
</ul></ul>
<p>Developers with an existing implementation of <a href="https://developer.android.com/media/implement/surfaces/mobile#config-action-buttons" target="_blank">action buttons</a> and <a href="https://developer.android.com/media/media3/session/control-playback#modify-playlist" target="_blank">playlist</a> can benefit from this feature without additional effort. Check out how users will get more controls from your media app on a Google Pixel Watch device.</p>
<h2><span style="font-size: x-large;">Start building for Wear OS 6 now</span></h2>
<p>With these updates, there’s never been a better time to develop an app on Wear OS. These technical resources are a great place to learn more about how to get started:</p>
<ul><ul>
<li><a href="https://developer.android.com/wear" target="_blank">Learn about designing and developing for Wear OS</a></li>
<li><a href="https://developer.android.com/codelabs/compose-for-wear-os#4" target="_blank">Take the Compose for Wear OS codelab</a></li>
<li><a href="https://github.com/android/wear-os-samples" target="_blank">Check out Wear OS samples on GitHub</a></li>
<li><a href="https://developer.android.com/training/wearables/versions/6/emulator" target="_blank">Get started with the latest Wear OS 6 emulator</a></li>
</ul></ul>
<p>Earlier this year, we expanded our smartwatch offerings with <a href="https://android-developers.googleblog.com/2025/01/build-kids-app-experiences-for-wear-os.html" target="_blank">Galaxy Watch for Kids</a>, a unique, phone-free experience designed specifically for children. This launch gives families a new way to stay connected, allowing children to explore Wear OS independently with a dedicated smartwatch. Consult our <a href="https://developer.android.com/training/wearables/kids/develop" target="_blank">developer guidance</a> to create a Wear OS app for kids.</p>
<p>We’re looking forward to seeing the experiences that you build on Wear OS!</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p><br />
<p><i><small><sup>1</sup> Actual battery performance varies.</small></i></p>
<p></p>What’s new in Watch Faces<meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEirC-HW9M-XO6PNfCB1lyyhuvOBFr1SiGMc2rxIIr7k2wd-E86U4_YBbRNfaXG2w2Tu8xgJ0xhrD_UXRK1uIRw8EyY-ldztQVSf4EnCJL2l3Rvj9lhVU9siNOzcpwLJmKDvjueoCya42LGR7_qig0fgf77n8q7Ck6ZCQ96ct0mk6O5Y5WgiFmG5Y6T68NA/s1600/new-watch-faces-google-io-meta.png" name="twitter:image"></meta>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEirC-HW9M-XO6PNfCB1lyyhuvOBFr1SiGMc2rxIIr7k2wd-E86U4_YBbRNfaXG2w2Tu8xgJ0xhrD_UXRK1uIRw8EyY-ldztQVSf4EnCJL2l3Rvj9lhVU9siNOzcpwLJmKDvjueoCya42LGR7_qig0fgf77n8q7Ck6ZCQ96ct0mk6O5Y5WgiFmG5Y6T68NA/s1600/new-watch-faces-google-io-meta.png" style="display: none;" />
<em>Posted by Garan Jenkin – Developer Relations Engineer</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi28VnjatnNxirnQXdiRTZmI54gtyzRQsvpwvRlU7FfVHiKEwJU13alqfiJMQAU5jJXrZUAv5jzs5wOYb7QYLpy_iekmqL-l74IAzZ0GtHhaVZ2zUhapb8jVGvWgEt5f7et1qteUIsO7Ou4vaQ0N9q7BfVtie30bGq2onZubnDNyeE28q2vRKuLhLH6Qrg/s1600/new-in-watch-faces-google-io-hero.png"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi28VnjatnNxirnQXdiRTZmI54gtyzRQsvpwvRlU7FfVHiKEwJU13alqfiJMQAU5jJXrZUAv5jzs5wOYb7QYLpy_iekmqL-l74IAzZ0GtHhaVZ2zUhapb8jVGvWgEt5f7et1qteUIsO7Ou4vaQ0N9q7BfVtie30bGq2onZubnDNyeE28q2vRKuLhLH6Qrg/s1600/new-in-watch-faces-google-io-hero.png" /></a>
<p>Wear OS has a thriving watch face ecosystem featuring a variety of designs that also aims to minimize battery impact. Developers have embraced the simplicity of creating watch faces using <a href="https://developer.android.com/training/wearables/wff" target="_blank">Watch Face Format</a> – in the last year, the number of published watch faces <b>using Watch Face Format has grown by over 180%<sup>*</sup></b>.</p>
<p>Today, we’re continuing our investment and announcing version 4 of the Watch Face Format, available as part of Wear OS 6. These updates allow developers to express even greater levels of creativity through the new features we’ve added. And we’re supporting marketplaces, which gives flexibility and control to developers and more choice for users.</p>
<p>In this blog post we'll cover the key new features. Check out the <a href="https://developer.android.com/training/wearables/wff/features#v3" target="_blank">documentation</a> for more details of changes introduced in recent versions.</p>
<h2><span style="font-size: x-large;">Supporting marketplaces with Watch Face Push</span></h2>
<p>We’re also announcing a completely new API, the <a href="https://developer.android.com/training/wearables/watch-face-push" target="_blank">Watch Face Push API</a>, aimed at developers who want to create their own watch face marketplaces.</p>
<p>Watch Face Push, available on devices running Wear OS 6 and above, works exclusively with watch faces built using the Watch Face Format.</p>
<p>We’ve partnered with well-known watch face developers – including <b><a href="https://www.facer.io/" target="_blank">Facer</a></b>, <b><a href="https://timeflik.com/" target="_blank">TIMEFLIK</a></b>, <b><a href="https://getwatchmaker.com/" target="_blank">WatchMaker</a></b>, <b><a href="https://pujie.io/" target="_blank">Pujie</a></b>, and <b><a href="https://www.recreative-watch.com/" target="_blank">Recreative</a></b> – in designing this new API. We’re excited that all of these developers will be bringing their unique watch face experiences to Wear OS 6 using Watch Face Push.</p>
<image><div style="text-align: center;"><img alt="Three mobile devices representing watch face marketplace apps for watches running Wear OS 6" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjDyv-8Vl_8RJzaYVc-o3mfLUHi7xiutV4SEszm4wVbpGFMrZnE3zu_80k9tHrgbYfPfD-yDd5msNiDQ94a7IegsxAQlF1KWH2cbR64hfxXuZYmTBOEmsjIN2hcB3IKLm3EjyQK4lwO1zowTcq6NNm5pw1ElqyaeItQEPeVTjPstCscCPEamE3l3tTzVqI/s16000/watch-faces--wear-os-6-google-io.png" /></div><imgcaption><center><em>From left to right, <b><a href="https://www.facer.io/" target="_blank">Facer</a></b>, <b><a href="https://www.recreative-watch.com/" target="_blank">Recreative</a></b> and <b><a href="https://timeflik.com/" target="_blank">TIMEFLIK</a></b> watch faces have been developing marketplace apps to work with watches running Wear OS 6.
</em></center></imgcaption></image><br />
<p>Watch faces managed and deployed using Watch Face Push are all written using Watch Face Format. Developers publish these watch faces in the same way as publishing through Google Play, though there are some additional checks the developer must make which are described in the Watch Face Push guidance.</p>
<image><div style="text-align: center;"><img alt="A flow diagram demonstrating the flow of information from Cloud-based storage to the user's phone where the app is installed, then transferred to be installed on a wearable device using the Wear OS App via the Watch Face Push API" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi-nC-EsEWiCVL3UdRHPm_dFpM7wZH6ObVUZeAxydYT0D2ZEGaXPV_A3vdHC_NIRTK8KBy-tphrX24Per1OANsQrGXFlfGFqcedZLdqslGmXtai_ALgXbTg7KLI72C1kl8we_DqVabvw89Nltq9_NoPUJpZpnGvTLpniv5shD5aDCjkrg1dDHYixW6xp8A/s16000/watch-face-api-architecture-wear-os-6-google-io.png" /></div></image><br />
<p>The Watch Face Push API covers only the watch part of this typical marketplace system diagram - as the app developer, you have control and responsibility for the phone app and cloud components, as well as for building the Wear OS app using Watch Face Push. You’re also in control of the phone-watch communications, for which we recommend using the <a href="https://developer.android.com/training/wearables/data/messages" target="_blank">Data Layer APIs</a>.</p>
<h2><span style="font-size: x-large;">Adding Watch Face Push to your project</span></h2>
<p>To start using Watch Face Push on Wear OS 6, include the following dependency in your Wear OS app:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: #408080; font-style: italic;">// Ensure latest version is used by checking the repository</span>
implementation(<span style="color: #ba2121;">"androidx.wear.watchface:watchface-push:1.3.0-alpha07"</span>)
</pre></div><br />
<p>Declare the necessary permission in your <span style="color: #0d904f; font-family: courier;">AndroidManifest.xml:</span></p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><uses-permission android:name=<span style="color: #ba2121;">"com.google.wear.permission.PUSH_WATCH_FACES"</span> />
</pre></div><br />
<p>Obtain a Watch Face Push client:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: green; font-weight: bold;">val</span> manager = WatchFacePushManagerFactory.createWatchFacePushManager(context)
</pre></div><br />
<p>You’re now ready to start using the Watch Face Push API, for example to list the watch faces you have already installed, or add a new watch face:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: #408080; font-style: italic;">// List existing watch faces, installed by this app</span>
<span style="color: green; font-weight: bold;">val</span> listResponse = manager.listWatchFaces()
<span style="color: #408080; font-style: italic;">// Add a watch face</span>
manager.addWatchFace(watchFaceFileDescriptor, validationToken)
</pre></div><br />
<h2><span style="font-size: x-large;">Understanding Watch Face Push</span></h2>
<p>While the basics of the Watch Face Push API are easy to understand and access through the <span style="color: #0d904f; font-family: courier;">WatchFacePushManager</span> interface, it’s important to consider several other factors when working with the API in practice to build an effective marketplace app, including:</p>
<ul><ul>
<li><b>How to build watch faces for use with Watch Face Push</b> - Watch faces deployed using Watch Face Push require an additional validation step to be performed by the developer. Learn more about <a href="https://developer.android.com/training/wearables/watch-face-push" target="_blank">how to build watch faces for use with Watch Face Push, and to integrate Watch Face Push into your application</a>.</li></ul><ul>
<li><b>Watch Face Slots</b> - Each Watch Face Push-based application can install a limited number of watch faces at any given time, each represented by a slot. <a href="https://developer.android.com/training/wearables/watch-face-push#slots" target="_blank">Learn more about how to work with and manage slots</a>.</li></ul><ul>
<li><b>Default watch faces</b> - The API allows for a default watch face to be installed when the app is installed. Learn more about <a href="https://developer.android.com/training/wearables/watch-face-push#default-watch-face" target="_blank">how to build and include this default watch face</a>.</li></ul><ul>
<li><b>Setting active watch faces</b> - Through an additional permission, the app can set the active watch face. Learn about <a href="https://developer.android.com/training/wearables/watch-face-push#set-active-watchface" target="_blank">how to integrate this feature</a>, as well as how to handle the different permission scenarios.</li>
</ul></ul>
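<p>Putting the slot checks together, a marketplace app might guard installs along these lines. This is a hedged sketch against the alpha API: names such as <span style="color: #0d904f; font-family: courier;">remainingSlotCount</span> and <span style="color: #0d904f; font-family: courier;">updateWatchFace</span> are assumptions, so consult the reference documentation before relying on them:</p>

```kotlin
// Hedged sketch: field and method names on WatchFacePushManager and its
// list response are assumptions based on the alpha API surface.
suspend fun installOrReplace(
    manager: WatchFacePushManager,
    watchFaceFd: ParcelFileDescriptor,
    validationToken: String
) {
    val listResponse = manager.listWatchFaces()
    if (listResponse.remainingSlotCount > 0) {
        // A free slot exists: install into it.
        manager.addWatchFace(watchFaceFd, validationToken)
    } else {
        // All slots are in use: replace an existing watch face instead.
        val slot = listResponse.installedWatchFaceDetails.first()
        manager.updateWatchFace(slot.slotId, watchFaceFd, validationToken)
    }
}
```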
<p>To learn more about using Watch Face Push, see the <a href="https://developer.android.com/training/wearables/watch-face-push" target="_blank">guidance</a> and <a href="https://developer.android.com/reference/kotlin/androidx/wear/watchface/push/package-summary" target="_blank">reference</a> documentation.</p>
<h2><span style="font-size: x-large;">Updates to Watch Face Format</span></h2>
<h3><span style="font-size: large;">Photos</span></h3>
<i>Available from Watch Face Format v4</i>
<p>The new <span style="color: #0d904f; font-family: courier;">Photos</span> element allows the watch face to contain user-selectable photos. The element supports both individual photos and a gallery of photos. For a gallery of photos, developers can choose whether the photos advance automatically or when the user taps the watch face.</p>
<image><div style="text-align: center;"><img alt="a wearable device and small screen mobile device side by side demonstrating how a user may configure photos for the watch face through the Companion app on the mobile device" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEilder6xB2opL02d5KbNW3AdFtHTqOtj31HGLSMHq7w86SZxR2q9KFCxstnXRhKlGZbjPuY_lUh3LTQD-EevrVx007pIo-skWBqHdFBgwGRZl7L1_LJwxBxQ5bAOxqAOehBMXeseM-2nvsXzFLua0aTy1HuyXAXRSnwB3rLwxjX-RronN2P9Ms8qUagmis/s1600/android-new-in-watch-faces-google-io-2025.png" /></div><imgcaption><center><em>Configuring photos through the watch Companion app</em></center></imgcaption></image><br />
<p>The user is able to select the photos of their choice through the companion app, making this a great way to include true personalization in your watch face. To use this feature, first add the necessary configuration:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><UserConfigurations>
<PhotosConfiguration id=<span style="color: #ba2121;">"myPhoto"</span> configType=<span style="color: #ba2121;">"SINGLE"</span>/>
</UserConfigurations>
</pre></div><br />
<p>Then use the <span style="color: #0d904f; font-family: courier;">Photos</span> element within any <span style="color: #0d904f; font-family: courier;">PartImage</span>, in the same way as you would for an <span style="color: #0d904f; font-family: courier;">Image element</span>:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><PartImage ...>
<Photos source=<span style="color: #ba2121;">"[CONFIGURATION.myPhoto]"</span>
defaultImageResource=<span style="color: #ba2121;">"placeholder_photo"</span>/>
</PartImage>
</pre></div><br />
<p>For details on how to support multiple photos, and how to configure the different change behaviors, refer to the Photos section of the <a href="https://developer.android.com/training/wearables/wff/personalization/photos" target="_blank">guidance</a> and <a href="https://developer.android.com/reference/wear-os/wff/group/part/image/photos" target="_blank">reference</a>, as well as the <a href="https://github.com/android/wear-os-samples/" target="_blank">GitHub samples</a>.</p>
<h2><span style="font-size: x-large;">Transitions</span></h2>
<i>Available from Watch Face Format v4</i>
<p>Watch Face Format now supports transitions when exiting and entering ambient mode.</p>
<image><div style="text-align: center;"><img alt="moving image demonstrating an overshoot effect adjusting the time on a watch face to reveal the seconds digit" border="0" height="200" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEijb2JauAR197bVq1WgtiCZMFJfYsD0W41h5c0JruKUTCwlERsk7hTECkaMENR0AglGj2m7NiwrfDpfR12li_UC1eF57Wg-wfaFgh393YPdJIYtAM0gfu8oApRpjKRAxlJ6eEg6KkMNFMhqFPC5QmCnZqGws2lCVTAGzIRz3AKmE22YEdelXkcL1Tru2KU/w200-h200/overshoot-effect-watch-faces-wear-os-google-io.gif" width="200" /></div><imgcaption><center><em>State transition animation: Example using an overshoot effect in revealing the seconds digits</em></center></imgcaption></image><br />
<p>This is achieved through the existing <span style="color: #0d904f; font-family: courier;">Variant</span> tag. For example, the hours and minutes in the above watch face are animated as follows:</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%"><DigitalClock ...>
<Variant mode=<span style="color: #BA2121">"AMBIENT"</span> target=<span style="color: #BA2121">"x"</span> value=<span style="color: #BA2121">"100"</span> interpolation=<span style="color: #BA2121">"OVERSHOOT"</span> />
<!-- Rest of <span style="color: #BA2121">"hh:mm"</span> clock definition here -->
</DigitalClock>
</pre></div>
<p>By default, the animation takes the full extent of the allowed time for the transition. The new <span style="color: #0d904f; font-family: courier;">interpolation</span> attribute controls the animation effect; in this case, the use of <span style="color: #0d904f; font-family: courier;">OVERSHOOT</span> adds a playful experience.</p>
<p>The seconds are implemented in a separate <span style="color: #0d904f; font-family: courier;">DigitalClock</span> element, which shows the use of the new <span style="color: #0d904f; font-family: courier;">duration</span> attribute:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><DigitalClock ...>
<Variant mode=<span style="color: #ba2121;">"AMBIENT"</span> target=<span style="color: #ba2121;">"alpha"</span> value=<span style="color: #ba2121;">"0"</span> duration=<span style="color: #ba2121;">"0.5"</span>/>
<!-- Rest of <span style="color: #ba2121;">"ss"</span> clock definition here -->
</DigitalClock>
</pre></div><br />
<p>The <span style="color: #0d904f; font-family: courier;">duration</span> attribute takes a value between <span style="color: #0d904f; font-family: courier;">0.0</span> and <span style="color: #0d904f; font-family: courier;">1.0</span>, with <span style="color: #0d904f; font-family: courier;">1.0</span> representing the full extent of the allowed time. In this example, a value of <span style="color: #0d904f; font-family: courier;">0.5</span> makes the seconds animation quicker, taking half the allowed time, in comparison to the hours and minutes, which take the entire transition period.</p>
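<p>The <span style="color: #0d904f; font-family: courier;">interpolation</span> and <span style="color: #0d904f; font-family: courier;">duration</span> attributes can also be combined on a single <span style="color: #0d904f; font-family: courier;">Variant</span>. As a rough sketch (the geometry and attribute values here are illustrative assumptions, not taken from the watch face above):</p>
<div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><DigitalClock x="0" y="0" width="450" height="450">
    <!-- Overshoot effect, but only over part of the transition time -->
    <Variant mode="AMBIENT" target="x" value="100"
        interpolation="OVERSHOOT" duration="0.75" />
    <!-- Rest of clock definition here -->
</DigitalClock>
</pre></div><br />
<p>This would run the overshoot animation over the first three quarters of the allowed transition time.</p>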
<p>For more details on using transitions, see the <a href="https://developer.android.com/training/wearables/wff/transform" target="_blank">guidance documentation</a>, as well as the reference documentation for <span style="font-family: courier;">Variant</span>.</p>
<h2><span style="font-size: x-large;">Color Transforms</span></h2>
<i>Available from Watch Face Format v4</i>
<p>We’ve extended the usefulness of the <span style="color: #0d904f; font-family: courier;">Transform</span> element by allowing <span style="color: #0d904f; font-family: courier;">color</span> to be transformed on the majority of elements where it is an attribute, and also allowing <span style="color: #0d904f; font-family: courier;">tintColor</span> to be transformed on <span style="color: #0d904f; font-family: courier;">Group</span> and <span style="color: #0d904f; font-family: courier;">Part<sup>*</sup></span> elements such as <span style="color: #0d904f; font-family: courier;">PartDraw</span> and <span style="color: #0d904f; font-family: courier;">PartText</span>.</p>
<p>The main exceptions to this addition are the clock elements, <span style="color: #0d904f; font-family: courier;">DigitalClock</span> and <span style="color: #0d904f; font-family: courier;">AnalogClock</span>, and also <span style="color: #0d904f; font-family: courier;">ComplicationSlot</span>, which do not currently support <span style="color: #0d904f; font-family: courier;">Transform</span>.</p>
<p>In addition to extending the list of transformable attributes to include colors, we’ve also added a handful of useful functions for manipulating color:</p>
<ul><ul>
<li><a href="https://developer.android.com/reference/wear-os/wff/common/attributes/arithmetic-expression#functions" target="_blank">extractColorFromColors(colors, interpolate, value)</a></li>
<li><a href="https://developer.android.com/reference/wear-os/wff/common/attributes/arithmetic-expression#functions" target="_blank">extractColorFromWeightedColors(colors, weights, interpolate, value)</a></li>
<li><a href="https://developer.android.com/reference/wear-os/wff/common/attributes/arithmetic-expression#functions" target="_blank">colorArgb(alpha, red, green, blue)</a></li>
<li><a href="https://developer.android.com/reference/wear-os/wff/common/attributes/arithmetic-expression#functions" target="_blank">colorRgb(red, green, blue)</a></li>
</ul></ul>
<p>To see these in action, let’s consider an example.</p>
<p>The Weather data source provides the current UV index through <span style="color: #0d904f; font-family: courier;">[WEATHER.UV_INDEX]</span>. When representing the UV index, these values are <a href="https://en.wikipedia.org/wiki/Ultraviolet_index#Index_usage" target="_blank">typically also assigned a color</a>:</p>
<image><div style="text-align: center;"><img alt="chart of UV index values from low to extreme, with the index ranges colored green, yellow, orange, red and violet" border="0" height="133" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgrs-GtIIjtk4sAcUWts5EDvOAeUjL61GYn9_fzOWfytLc4Sgwp5E8_24u_jEycjAMKWmG0LDBJoxWhQ7goco_Qz6bF48X11jeF-8tR86on-XATSqDSchseVMNUAPY1l7xIQlQ1tX-q6Eo0D2Rhc4wJcxBWw-WWoCxm5PHZZVO34_8njX95JRSXvNM0Bzc/w640-h133/uv-index-values-wear-os-6-google-io.png" width="640" /></div></image><br />
<p>We want to represent this information as an <span style="color: #0d904f; font-family: courier;">Arc</span>, not only showing the value, but also using the appropriate color. We can achieve this as follows:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><Arc centerX=<span style="color: #ba2121;">"0"</span> centerY=<span style="color: #ba2121;">"0"</span> height=<span style="color: #ba2121;">"420"</span> width=<span style="color: #ba2121;">"420"</span>
startAngle=<span style="color: #ba2121;">"165"</span> endAngle=<span style="color: #ba2121;">"165"</span> direction=<span style="color: #ba2121;">"COUNTER_CLOCKWISE"</span>>
<Transform target=<span style="color: #ba2121;">"endAngle"</span>
    value=<span style="color: #ba2121;">"165 - 40 * (clamp([WEATHER.UV_INDEX], 0.0, 11.0) / 11.0)"</span> />
<Stroke thickness=<span style="color: #ba2121;">"20"</span> color=<span style="color: #ba2121;">"#ffffff"</span> cap=<span style="color: #ba2121;">"ROUND"</span>>
<Transform target=<span style="color: #ba2121;">"color"</span>
value=<span style="color: #ba2121;">"extractColorFromWeightedColors(#97d700 #FCE300 #ff8200 #f65058 #9461c9, 3 3 2 3 1, false, clamp([WEATHER.UV_INDEX] + 0.5, 0.0, 12.0) / 12.0)"</span> />
</Stroke>
</Arc>
</pre></div><br />
<p>Let’s break this down:</p>
<ul><ul>
<li>The first <span style="color: #0d904f; font-family: courier;">Transform</span> restricts the UV index to the range 0.0 to 11.0 and adjusts the sweep of the <span style="color: #0d904f; font-family: courier;">Arc</span> according to that value.</li></ul><ul>
<li>The second <span style="color: #0d904f; font-family: courier;">Transform</span> uses the new <span style="color: #0d904f; font-family: courier;">extractColorFromWeightedColors</span> function.</li>
<ul><ul>
<li>The <b>first</b> argument is our list of colors</li></ul><ul>
<li>The <b>second</b> argument is a list of weights; you can see from the chart above that green covers 3 values, whereas orange only covers 2, so we use weights to represent this.</li></ul><ul>
<li>The <b>third</b> argument is whether or not to interpolate the color values. In this case we want to stick strictly to the color convention for UV index, so this is false.</li></ul><ul>
<li>Finally, in the <b>fourth</b> argument, we coerce the UV value into the range <span style="color: #0d904f; font-family: courier;">0.0</span> to <span style="color: #0d904f; font-family: courier;">1.0</span>, which is used as an index into our weighted colors.</li>
</ul></ul></ul></ul>
<p>The result looks like this:</p>
<image><div style="text-align: center;"><img alt="side by side quadrants of watch face examples showing using the new color functions in applying color transforms to a Stroke in an Arc" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiHABmwTjM-OTP9l1KZxgmHrXVkgu5lCkaJ_fUhgM6vso6ZERq2j_p24x_WVxDAyx7nBGPmGrRatZqwgZEeJgLxBF42SDxUSEyPbYtezXMablU9US-dEO4_oiAA7RjR9ZD_m1NZPxwCRytT4WRXbwGG5Iti2yP99b5gzXu2eEzgIpiKVfqWkYIfsfOyAUk/s16000/color-function-transformation-stroke-arc-watch-face-wear-os.png" /></div><imgcaption><center><em>Using the new color functions in applying color transforms to a <span style="color: #0d904f; font-family: courier;">Stroke</span> in an <span style="color: #0d904f; font-family: courier;">Arc.</span></em></center></imgcaption></image><br />
<p>As well as taking raw colors and weights, these functions can also be used with values from complications, such as heart rate, temperature or steps goal. For example, to use the color range specified in a goal complication:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><Transform target=<span style="color: #ba2121;">"color"</span>
value=<span style="color: #ba2121;">"extractColorFromColors(</span>
<span style="color: #7d9029;"> [COMPLICATION.GOAL_PROGRESS_COLORS]</span>,
<span style="color: #7d9029;"> [COMPLICATION.GOAL_PROGRESS_COLOR_INTERPOLATE]</span>,
<span style="color: #7d9029;"> [COMPLICATION.GOAL_PROGRESS_VALUE]</span> /
<span style="color: #7d9029;"> [COMPLICATION.GOAL_PROGRESS_TARGET_VALUE]</span>
)<span style="color: #ba2121;">"/></span>
</pre></div><br />
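<p>The color functions can also be driven by other data sources directly. As a minimal sketch (assuming <span style="color: #0d904f; font-family: courier;">[BATTERY_PERCENT]</span> is available as a data source in your watch face), this interpolates a stroke color from red through yellow to green as the battery charges:</p>
<div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><Stroke thickness="20" color="#ffffff" cap="ROUND">
    <!-- 0% battery -> red, 50% -> yellow, 100% -> green -->
    <Transform target="color"
        value="extractColorFromColors(#ff0000 #ffff00 #00ff00, true, [BATTERY_PERCENT] / 100.0)" />
</Stroke>
</pre></div><br />
<p>Here, setting the interpolate argument to <span style="color: #0d904f; font-family: courier;">true</span> blends smoothly between adjacent colors, rather than snapping between bands as in the UV index example.</p>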
<h2><span style="font-size: x-large;">Introducing the <span style="color: #0d904f; font-family: courier;">Reference</span> element</span></h2>
<i>Available from Watch Face Format v4</i>
<p>The new <span style="color: #0d904f; font-family: courier;">Reference</span> element allows you to refer to any transformable attribute from one part of your watch face scene in other parts of the scene tree.</p>
<p>In our UV index example above, we’d also like the text labels to use the same color scheme.</p>
<p>We could perform the same color transform calculation as on our <span style="color: #0d904f; font-family: courier;">Arc</span>, using <span style="color: #0d904f; font-family: courier;">[WEATHER.UV_INDEX]</span>, but this duplicates work and could lead to inconsistencies, for example, if we change the exact color hues in one place but not the other.</p>
<p>Returning to the <span style="color: #0d904f; font-family: courier;">Arc</span> definition, let’s create a <span style="color: #0d904f; font-family: courier;">Reference</span> to the color:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><Arc centerX=<span style="color: #ba2121;">"0"</span> centerY=<span style="color: #ba2121;">"0"</span> height=<span style="color: #ba2121;">"420"</span> width=<span style="color: #ba2121;">"420"</span>
startAngle=<span style="color: #ba2121;">"165"</span> endAngle=<span style="color: #ba2121;">"165"</span> direction=<span style="color: #ba2121;">"COUNTER_CLOCKWISE"</span>>
<Transform target=<span style="color: #ba2121;">"endAngle"</span>
    value=<span style="color: #ba2121;">"165 - 40 * (clamp([WEATHER.UV_INDEX], 0.0, 11.0) / 11.0)"</span> />
<Stroke thickness=<span style="color: #ba2121;">"20"</span> color=<span style="color: #ba2121;">"#ffffff"</span> cap=<span style="color: #ba2121;">"ROUND"</span>>
<Reference source=<span style="color: #ba2121;">"color"</span> name=<span style="color: #ba2121;">"uv_color"</span> defaultValue=<span style="color: #ba2121;">"#ffffff"</span> />
<Transform target=<span style="color: #ba2121;">"color"</span>
value=<span style="color: #ba2121;">"extractColorFromWeightedColors(#97d700 #FCE300 #ff8200 #f65058 #9461c9, 3 3 2 3 1, false, clamp([WEATHER.UV_INDEX] + 0.5, 0.0, 12.0) / 12.0)"</span> />
</Stroke>
</Arc>
</pre></div><br />
<p>The color of the <span style="color: #0d904f; font-family: courier;">Arc</span> is calculated from the relatively complex <span style="color: #0d904f; font-family: courier;">extractColorFromWeightedColors</span> function. To avoid repeating this elsewhere in our watch face, we have added a <span style="color: #0d904f; font-family: courier;">Reference</span> element, which takes as its source the <span style="color: #0d904f; font-family: courier;">Stroke</span> color.</p>
<p>Let’s now look at how we can consume this value in a <span style="color: #0d904f; font-family: courier;">PartText</span> elsewhere in the watch face. We gave the <span style="color: #0d904f; font-family: courier;">Reference</span> the name <span style="color: #0d904f; font-family: courier;">uv_color</span>, so we can simply refer to this in any expression:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><PartText x=<span style="color: #ba2121;">"0"</span> y=<span style="color: #ba2121;">"225"</span> width=<span style="color: #ba2121;">"450"</span> height=<span style="color: #ba2121;">"225"</span>>
<TextCircular centerX=<span style="color: #ba2121;">"225"</span> centerY=<span style="color: #ba2121;">"0"</span> width=<span style="color: #ba2121;">"420"</span> height=<span style="color: #ba2121;">"420"</span>
startAngle=<span style="color: #ba2121;">"120"</span> endAngle=<span style="color: #ba2121;">"90"</span>
align=<span style="color: #ba2121;">"START"</span> direction=<span style="color: #ba2121;">"COUNTER_CLOCKWISE"</span>>
<Font family=<span style="color: #ba2121;">"SYNC_TO_DEVICE"</span> size=<span style="color: #ba2121;">"24"</span>>
<Transform target=<span style="color: #ba2121;">"color"</span> value=<span style="color: #ba2121;">"[REFERENCE.uv_color]"</span> />
<Template>%d<Parameter expression=<span style="color: #ba2121;">"[WEATHER.UV_INDEX]"</span> /></Template>
</Font>
</TextCircular>
</PartText>
<!-- Similar PartText here <span style="color: green; font-weight: bold;">for</span> the <span style="color: #ba2121;">"UV:"</span> label -->
</pre></div>
<p>As a result, the color of the <span style="color: #0d904f; font-family: courier;">Arc</span> and the UV numeric value are now coordinated:</p>
<image><div style="text-align: center;"><img alt="side by side quadrants of watch face examples showing Coordinating colors across elements using the Reference element" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhsUlGBDxKyi4nJYQBbx2jb4lIR8DmxABazY7D_FkMKvbvNBJArGM8NY2d9iz489F0IX5kP077pOZrP4t1M2mtt5HYS3AOGksQ-UWxwUH3qVTNDo09iiS4JRl3xV35rHxThXGbC9dCBXitFlXkudX8wOHnCNO1PPvPOjHMxchBIEiFjZbQOK4R1FHwAQpM/s1600/reference-element-watch-faces-wear-os-google-io.png" /></div><imgcaption><center><em>Coordinating colors across elements using the <span style="color: #0d904f; font-family: courier;">Reference</span> element</em></center></imgcaption></image><br />
<p>For more details on how to use the <span style="font-family: courier;"><a href="https://developer.android.com/reference/wear-os/wff/common/reference" target="_blank">Reference</a></span> element, refer to the <span style="font-family: courier;"><a href="https://developer.android.com/reference/wear-os/wff/common/reference/reference" target="_blank">Reference</a></span> guidance.</p>
<h2><span style="font-size: x-large;">Text autosizing</span></h2>
<i>Available from Watch Face Format v3</i>
<p>The exact length of the text shown on a watch face can vary, and as a developer you want to balance displaying text that is legible with displaying text that is complete.</p>
<p>Auto-sizing text can help solve this problem, and can be enabled through the <span style="color: #0d904f; font-family: courier;">isAutoSize</span> attribute introduced to the <span style="color: #0d904f; font-family: courier;">Text</span> element:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><Text align=<span style="color: #ba2121;">"CENTER"</span> isAutoSize=<span style="color: #ba2121;">"true"</span>>
</pre></div><br />
<p>Having set this attribute, the text will automatically fit the available space, starting at the maximum size specified in your <span style="color: #0d904f; font-family: courier;">Font</span> element, and shrinking to a minimum size of 12.</p>
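<p>Putting this together, a minimal sketch of an auto-sizing step count (the geometry and the <span style="color: #0d904f; font-family: courier;">[STEP_COUNT]</span> source are illustrative assumptions):</p>
<div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><PartText x="0" y="175" width="450" height="100">
    <Text align="CENTER" isAutoSize="true">
        <!-- Starts at size 48, shrinking as needed to a minimum of 12 -->
        <Font family="SYNC_TO_DEVICE" size="48">
            <Template>%d<Parameter expression="[STEP_COUNT]" /></Template>
        </Font>
    </Text>
</PartText>
</pre></div><br />
<p>Here the text is rendered at size 48 wherever it fits, and shrinks toward the minimum whenever the value would otherwise overflow the bounding box.</p>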
<p>As an example, step count could range from tens or hundreds through to many thousands, and the new <span style="color: #0d904f; font-family: courier;">isAutoSize</span> attribute enables best use of the available space for every possible value:</p>
<image><div style="text-align: center;"><img alt="side by side examples of text sizing adjustments on watch face using isAutosize" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjACl9z-Sc8H9w2dRpYiPlIwrpNQU5BVaqkE1JoqHqFRVkyX69J8W8g0weZWCt-bhREsOVLN4PXGjtIiLW4IV2aCEuMcqf62MYB_PUh7_wZs8vqG1PtKA80dt0jWjzgn7AL_d3hePpDBjqHj7hyphenhyphenI3SexgPqGKFq7PET7XXFxUuS8VxRkPRYNNeydBdGPnM/s1600/isAutosize-wear-os-watch-faces-google-io.png" /></div><imgcaption><center><em>Making the best use of the available text space through <span style="color: #0d904f; font-family: courier;">isAutoSize</span></em></center></imgcaption></image><br />
<p>For more details on <span style="color: #0d904f; font-family: courier;">isAutoSize</span>, see the <a href="https://developer.android.com/reference/wear-os/wff/group/part/text/text?version=4" target="_blank"><span style="font-family: courier;">Text</span> reference</a>.</p>
<h2><span style="font-size: x-large;">Android Studio support</span></h2>
<p>For developers working in Android Studio, we’ve added support to make working with Watch Face Format easier, including:</p>
<ul><ul>
<li>Run configuration support</li>
<li>Auto-complete and resource reference</li>
<li>Lint checking</li>
</ul></ul>
<p>This is available from <a href="https://developer.android.com/studio/preview" target="_blank">Android Studio version 2025.1.1 Canary 10</a>.</p>
<h2><span style="font-size: x-large;">Learn More</span></h2>
<p>To learn more about building watch faces, please take a look at the following resources:</p>
<ul><ul>
<li><a href="https://developer.android.com/training/wearables/wff" target="_blank">Watch Face Format guidance</a></li>
<li><a href="https://developer.android.com/reference/wear-os/wff/watch-face" target="_blank">Watch Face Format reference</a></li>
</ul></ul>
<p>We’ve also recently launched a <a href="https://developer.android.com/codelabs/watch-face-format" target="_blank">codelab for Watch Face Format</a> and have <a href="https://github.com/android/wear-os-samples/tree/main/WatchFaceFormat" target="_blank">updated samples</a> on GitHub to showcase new features. The <a href="https://issuetracker.google.com/issues/new?component=1112371" target="_blank">issue tracker</a> is available for providing feedback.</p>
<p>We're excited to see the watch face experiences that you create and share!</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p><br />
<i><small><sup>*</sup> Google Play data for period 2025-03-24 to 2025-03-23</small></i>
What's New in Jetpack Compose<meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9fkaR659shSkWXUVdlrR6N6JnD_3c1cFOV-x4wkpSgF01807L3vICUfsA45R-A-B1r2AtHdwkUnC4XKpvD5G2p-FjTsF177qBpFyhBJtQ0Z7cZiPdxRZkeKZv00N_pJL3Tpom6Sdx49r4FZW79uc07ov3twERtgqPiYaBLg2AWI3sONZE4pCdPqIzSv0/s1600/new-in-jetpack-compose-google-io-meta.gif" name="twitter:image"></meta>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9fkaR659shSkWXUVdlrR6N6JnD_3c1cFOV-x4wkpSgF01807L3vICUfsA45R-A-B1r2AtHdwkUnC4XKpvD5G2p-FjTsF177qBpFyhBJtQ0Z7cZiPdxRZkeKZv00N_pJL3Tpom6Sdx49r4FZW79uc07ov3twERtgqPiYaBLg2AWI3sONZE4pCdPqIzSv0/s1600/new-in-jetpack-compose-google-io-meta.gif" style="display: none;" />
<em>Posted by Nick Butcher – Product Manager</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhMgRGZWNX77zS6mCr3oYpd37jBJerIQMhszxrDGPGVMbu5G0ov0dXJ1isuxfmPScPX47mrEGcrfTfTjVEWQxrEi74fWPmbzesAdhosiJWw8A89PByBE7XkGVoUdbx7RGuScw7ArTfAVGyrKYEsmoZuhwJ-wSzFyqT1YIG1vGiSTgOSbWJtxoCel5V9fxY/s1600/new-in-jetpack-compose.png"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhMgRGZWNX77zS6mCr3oYpd37jBJerIQMhszxrDGPGVMbu5G0ov0dXJ1isuxfmPScPX47mrEGcrfTfTjVEWQxrEi74fWPmbzesAdhosiJWw8A89PByBE7XkGVoUdbx7RGuScw7ArTfAVGyrKYEsmoZuhwJ-wSzFyqT1YIG1vGiSTgOSbWJtxoCel5V9fxY/s1600/new-in-jetpack-compose.png" /></a>
<p>At Google I/O 2025, we announced a host of features, performance, stability, libraries, and tools updates for <a href="https://developer.android.com/compose" target="_blank">Jetpack Compose</a>, our recommended Android UI toolkit. With Compose you can build excellent apps that work across devices. Compose has matured a lot since it was first announced (at Google I/O 2019!) and we're now seeing <a href="https://developer.android.com/compose#apps-built-with-compose" target="_blank">60% of the top 1,000 apps in the Play Store</a>, such as MAX and Google Drive, use and love it.</p>
<h2><span style="font-size: x-large;">New Features</span></h2>
<p>Since I/O last year, we have released Compose Bill of Materials (BOM) version 2025.05.01, which adds new features such as:</p>
<ul><ul>
<li><b>Autofill support</b> that lets users automatically insert previously entered personal information into text fields.
</li><li><b>Auto-sizing text</b> to smoothly adapt text size to a parent container size.
</li><li><b>Visibility tracking </b>for when you need high-performance information on a composable's position in its root container, screen, or window.
</li><li><b>Animate bounds modifier</b> for beautiful automatic animations of a Composable's position and size within a LookaheadScope.
</li><li><b>Accessibility checks in tests</b> that let you build a more accessible app UI through automated a11y testing.
</li></ul></ul><br />
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;">LookaheadScope {
Box(
Modifier
.animateBounds(<span style="color: green; font-weight: bold;">this</span>@LookaheadScope)
.width(<span style="color: green; font-weight: bold;">if</span>(inRow) <span style="color: #666666;">100.d</span>p <span style="color: green; font-weight: bold;">else</span> <span style="color: #666666;">150.d</span>p)
.background(..)
.border(..)
)
}
</pre></div>
<image><div style="text-align: center;"><img alt="moving image of animate bounds modifier in action" border="0" height="324" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEisG1SCzZ-1PSz3bcDEA7QXvQxwjKbfQEk1s1NbsqZMdDspyyBb1ptzitPMljKLcOatObogx5lPrIcbGDUkVocj299rFCrdE-J99kMraCwr1bM6AWPL3CXl8HxOJK9X-RdV0bT_ZDsBggHP5bt3PoOrod3fUzEEabpKmI-i-oJjf_0S8wyQ-qJDjmhIumE/w400-h324/animate-bounds-modifier.gif" width="400" /></div></image><br />
<p>For more details on these features, read <a href="https://android-developers.googleblog.com/2025/04/whats-new-in-jetpack-compose-april-25.html" target="_blank">What’s new in the Jetpack Compose April ’25 release</a> and check out these talks from Google I/O:</p>
<ul><ul>
<li><a href="https://io.google/2025/explore/technical-session-16" target="_blank">Mastering text input in Compose</a></li>
<li><a href="https://io.google/2025/explore/technical-session-9" target="_blank">Build more accessible UIs with Jetpack Compose</a></li>
</ul></ul>
<p>If you’re looking to try out new Compose functionality, the <a href="https://developer.android.com/develop/ui/compose/bom#what_if_i_want_to_try_out_alpha_or_beta_releases_of_compose_libraries" target="_blank">alpha BOM</a> offers new features that we're working on including:</p>
<ul><ul>
<li>Pausable Composition (see below)</li>
<li>Updates to LazyLayout prefetch</li>
<li>Context Menus</li>
<li>New modifiers: <span style="color: #0d904f; font-family: courier;">onFirstVisible</span>, <span style="color: #0d904f; font-family: courier;">onVisibilityChanged</span>, <span style="color: #0d904f; font-family: courier;">contentType</span></li>
<li>New Lint checks for frequently changing values and elements that should be remembered in composition</li>
</ul></ul>
<p>Please try out the alpha features and <a href="https://issuetracker.google.com/issues/new?component=612128" target="_blank">provide feedback</a> to help shape the future of Compose.</p>
<h2><span style="font-size: x-large;">Material Expressive</span></h2>
<p>At Google I/O, we unveiled Material Expressive, Material Design’s latest evolution that helps you make your products even more engaging and easier to use. It's a comprehensive addition of new components, styles, motion and customization options that help you to build beautiful rich UIs. The Material3 library in the latest <a href="https://developer.android.com/develop/ui/compose/bom#what_if_i_want_to_try_out_alpha_or_beta_releases_of_compose_libraries" target="_blank">alpha BOM</a> contains many of the new expressive components for you to try out.</p>
<image><div style="text-align: center;"><img alt="moving image of material expressive design example" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9fkaR659shSkWXUVdlrR6N6JnD_3c1cFOV-x4wkpSgF01807L3vICUfsA45R-A-B1r2AtHdwkUnC4XKpvD5G2p-FjTsF177qBpFyhBJtQ0Z7cZiPdxRZkeKZv00N_pJL3Tpom6Sdx49r4FZW79uc07ov3twERtgqPiYaBLg2AWI3sONZE4pCdPqIzSv0/s16000/new-in-jetpack-compose-google-io-meta.gif" /></div></image><br />
<p>Learn more to <a href="https://m3.material.io/blog/building-with-m3-expressive" target="_blank">start building with Material Expressive</a>.</p>
<h2><span style="font-size: x-large;">Adaptive layouts library</span></h2>
<p>Developing adaptive apps across form factors including phones, foldables, tablets, desktop, cars and Android XR is now easier with the latest enhancements to the Compose adaptive layouts library. The stable <a href="https://developer.android.com/jetpack/androidx/releases/compose-material3-adaptive#1.1.0" target="_blank">1.1 release</a> adds support for predictive back gestures for smoother transitions and pane expansion for more flexible two pane layouts on larger screens. Furthermore, the <a href="https://developer.android.com/jetpack/androidx/releases/compose-material3-adaptive#compose_material3_adaptive_version_12_2" target="_blank">1.2 (alpha) release</a> adds more flexibility for how panes are displayed, adding strategies for reflowing and levitating.</p>
<image><div style="text-align: center;"><img alt="moving image of compose adaptive layouts updates in the Google Play app" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg9ir7eGSz036zy966UuT7AeqwENw5jbJpvzJKgZvEvg-55RlZHecPCUvF2QJifyt0WwE9wjj95MYp0Y6uR7ZjfKkmpGHlaZ09Qx-t7OCSqKkqWiaDeQkToabXWqe3jJhSaehclZiNPkWejK1jG0UD8rJqC-3PB5EIs0EyIX98iDijA9r4G4Pvlj4Vm50w/s16000/compose-adaptive-layouts-updates-google-play-app.gif" /></div><imgcaption><center><em>Compose Adaptive Layouts Updates in the Google Play app</em></center></imgcaption></image><br />
<p>Learn more about <a href="https://android-developers.googleblog.com/2025/05/adaptiveapps-io25.html" target="_blank">building adaptive android apps with Compose</a>.</p>
<h2><span style="font-size: x-large;">Performance</span></h2>
<p>With each release of Jetpack Compose, we continue to prioritize performance improvements. The latest stable release includes significant rewrites and improvements to multiple sub-systems including semantics, focus and text optimizations. Best of all, these are available to you simply by <b>upgrading your Compose dependency</b>; no code changes required.</p>
<image><div style="text-align: center;"><img alt="bar chart of internal benchmarks for performance run on a Pixel 3a device from January to May 2023 measured by jank rate" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgTlzZ4wBTQj8f_ZZpQycILJ9AiFywOWWzEVagbAHx2Ko7CRUcf0rWK8YsOXJqBxrDr2jLM7L7Fzcf16tj5NGTDMb8Vs5by9LW636hcGbqI78shnFSolIu4Qug5wpu_Ib8MZ5Eyj5g9MFfGH9wlmn7pPVdzaFjlVWCwBj7n0H4NI9jo68KEyWv9d6vr724/s1600/performance-jank-rate-adaptive-compose-google-io.png" /></div><imgcaption><center><em>Internal benchmark, run on a Pixel 3a</em></center></imgcaption></image><br />
<p>We continue to work on further performance improvements, notable changes in the latest alpha BOM include:</p>
<ul><ul>
<li><b>Pausable Composition</b> allows compositions to be paused, and their work split up over several frames.</li>
<li><b>Background text prefetch</b> enables text layout caches to be pre-warmed on a background thread, enabling faster text layout.</li>
<li><b>LazyLayout prefetch improvements</b> enabling lazy layouts to be smarter about how much content to prefetch, taking advantage of pausable composition.</li>
</ul></ul>
<p>Together these improvements eliminate nearly all jank in an internal benchmark.</p>
<h2><span style="font-size: x-large;">Stability</span></h2>
<p>We've heard from you that upgrading your Compose dependency can be challenging; you may encounter bugs or behavior changes that prevent you from staying on the latest version. We've invested significantly in improving the stability of Compose, working closely with the many Google app teams building with Compose to detect and prevent issues before they even make it to a release.</p>
<p>Google apps develop against and release with snapshot builds of Compose; as such, Compose is tested against the <b>hundreds of thousands of Google app tests</b> and any Compose issues are immediately actioned by our team. We have recently invested in increasing the cadence of updating these snapshots and now update them <b>daily from Compose tip-of-tree</b>, which means we’re receiving feedback faster, and are able to resolve issues long before they reach a public release of the library.</p>
<p>Jetpack Compose also relies on <span style="color: #0d904f; font-family: courier;">@Experimental</span> annotations to mark APIs that are subject to change. We heard your feedback that some APIs have remained experimental for a long time, reducing your confidence in the stability of Compose. We have invested in stabilizing experimental APIs to provide you a more solid API surface, and <b>reduced the number of experimental APIs by 32% in the last year</b>.</p>
<p>We have also heard that it can be hard to debug Compose crashes when your own code does not appear in the stack trace. In the latest alpha BOM, we have added a new <a href="https://developer.android.com/reference/kotlin/androidx/compose/runtime/Composer?hl=en#setDiagnosticStackTraceEnabled%28kotlin.Boolean%29" target="_blank">opt-in feature</a> to provide more diagnostic information. Note that this does not currently work with minified builds and comes at a performance cost, so we recommend only using this feature in debug builds.</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: green; font-weight: bold;">class</span> <span style="color: blue; font-weight: bold;">App</span> : Application() {
    <span style="color: green; font-weight: bold;">override</span> <span style="color: green; font-weight: bold;">fun</span> <span style="color: blue;">onCreate</span>() {
        <span style="color: green; font-weight: bold;">super</span>.onCreate()
        <span style="color: #408080; font-style: italic;">// Enable only for debug builds to avoid the performance impact in release</span>
        Composer.setDiagnosticStackTraceEnabled(BuildConfig.DEBUG)
    }
}
</pre></div>
<h2><span style="font-size: x-large;">Libraries</span></h2>
<p>We know that to build great apps, you need Compose integration in the libraries that interact with your app's UI.</p>
<p>A core library that powers any Compose app is <b>Navigation</b>. You told us that you often encountered limitations when managing state hoisting and directly manipulating the back stack with the current Compose Navigation solution. We went back to the drawing board and completely reimagined how a navigation library should integrate with the Compose mental model. We're excited to introduce <b><a href="https://goo.gle/nav3_launch_blog" target="_blank">Navigation 3</a></b>, a new artifact designed to empower you with greater control and simplify complex navigation flows.</p>
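<p>The shift in mental model is that the back stack becomes ordinary state your app owns and mutates directly, rather than something hidden inside a navigation controller. Below is a plain-Kotlin sketch of that idea; the <span style="font-family: courier;">BackStack</span> class and its operations are illustrative, not the Navigation 3 API.</p>

```kotlin
// Hypothetical sketch: the back stack as a list of route keys that the
// app pushes to and pops from directly.
class BackStack<T>(start: T) {
    private val entries = mutableListOf(start)

    // The top of the stack is the currently displayed destination.
    val current: T get() = entries.last()

    fun push(key: T) {
        entries.add(key)
    }

    // Returns false when already at the root, letting the caller decide
    // what to do instead (for example, finish the activity).
    fun pop(): Boolean {
        if (entries.size <= 1) return false
        entries.removeAt(entries.lastIndex)
        return true
    }
}

fun main() {
    val backStack = BackStack("home")
    backStack.push("details/42")
    println(backStack.current) // details/42
    backStack.pop()
    println(backStack.current) // home
}
```

In a real Compose app the stack would be backed by snapshot state so the UI recomposes as it changes; the point of the sketch is only that navigation state is data you can observe, persist, and manipulate like any other state.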
<p>We're also investing in Compose support for <b>CameraX and Media3</b>, making it easier to integrate camera capture and video playback into your UI with Compose idiomatic components.</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;">@Composable
<span style="color: green; font-weight: bold;">private</span> <span style="color: green; font-weight: bold;">fun</span> <span style="color: blue;">VideoPlayer</span>(
    player: Player?, <span style="color: #408080; font-style: italic;">// from media3</span>
    modifier: Modifier = Modifier
) {
    Box(modifier) {
        PlayerSurface(player) <span style="color: #408080; font-style: italic;">// from media3-ui-compose</span>
        player?.let {
            <span style="color: #408080; font-style: italic;">// custom play-pause button UI</span>
            <span style="color: green; font-weight: bold;">val</span> playPauseButtonState = rememberPlayPauseButtonState(it) <span style="color: #408080; font-style: italic;">// from media3-ui-compose</span>
            MyPlayPauseButton(playPauseButtonState, Modifier.align(BottomEnd).padding(<span style="color: #666666;">16</span>.dp))
        }
    }
}
</pre></div><br/>
<p>To learn more, see the <a href="https://developer.android.com/media/media3/ui/compose" target="_blank">media3 Compose documentation</a> and the <a href="https://github.com/android/platform-samples/tree/main/samples/camera/camerax/src/main/java/com/example/platform/camerax/basic/CameraXBasic.kt" target="_blank">CameraX samples</a>.</p>
<h2><span style="font-size: x-large;">Tools</span></h2>
<p>We continue to improve the Android Studio tools for creating Compose UIs. The <a href="https://developer.android.com/studio/preview" target="_blank">latest Narwhal canary</a> includes:</p>
<ul><ul>
<li><b>Resizable Previews</b> instantly show you how your Compose UI adapts to different window sizes.</li>
<li><b>Preview navigation improvements</b> using clickable names and components.</li>
<li><b>Studio Labs</b> 🧪: <b>Compose preview generation with Gemini</b> to quickly generate a preview.</li>
<li><b>Studio Labs</b> 🧪: <b>Transform UI with Gemini</b> to change your UI using natural language, directly from the preview.</li>
<li><b>Studio Labs</b> 🧪: <b>Image attachment in Gemini</b> to generate Compose code from images.</li>
</li></ul></ul>
<p>For more information, read <a href="https://android-developers.googleblog.com/2025/05/google-io-2025-whats-new-in-android-development-tools.html" target="_blank">What's new in Android development tools</a>.</p>
<image><div style="text-align: center;"><img alt="moving image of resizable preview in Jetpack Compose" border="0" height="640" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgLP4LB4TS7QAJZV2jLVv2Vy9-UG4DvOpMUesRcBPMdS1Ci00vTQ7niK_Uqe-AEYtvef1uUtxew-WrEL-WRNIjD3IUiv1GRSOtkS2RbejYb4_MYSNs3koc1yIno4J0qYJDoR8UirR7UUZn5kzSxuBI5sBXDF2vCCWsbzC5Xu7ItJK53Y7s6HetpGvKdsbo/w568-h640/resizable-preview-jetpack-comopse.gif" width="568" /></div><imgcaption><center><em>Resizable Preview</em></center></imgcaption></image><br />
<h2><span style="font-size: x-large;">New Compose Lint checks</span></h2>
<p>The Compose alpha BOM introduces two new annotations and associated lint checks to help you write correct and performant Compose code. The <span style="font-family: courier;"><a href="https://developer.android.com/reference/kotlin/androidx/compose/runtime/annotation/FrequentlyChangingValue" target="_blank">@FrequentlyChangingValue</a></span> annotation and <span style="color: #0d904f; font-family: courier;">FrequentlyChangedStateReadInComposition</span> lint check warn about function calls or property reads in composition that might cause frequent recompositions, for example when reading scroll position values or animating values. The <span style="font-family: courier;"><a href="https://developer.android.com/reference/kotlin/androidx/compose/runtime/annotation/RememberInComposition" target="_blank">@RememberInComposition</a></span> annotation and <span style="color: #0d904f; font-family: courier;">RememberInCompositionDetector</span> lint check warn when constructors, functions, and property getters are called directly inside composition (e.g. the <a href="https://developer.android.com/reference/kotlin/androidx/compose/foundation/text/input/TextFieldState?hl=en&_gl=1*8wj983*_up*MQ..*_ga*MTM3MDAzNTAwMS4xNzQ2Nzg3NTgx*_ga_6HH9YJMN9M*czE3NDY3ODc1ODEkbzEkZzAkdDE3NDY3ODc1ODEkajAkbDAkaDEwMDgyMDU2MDM.#TextFieldState%28kotlin.String,androidx.compose.ui.text.TextRange%29" target="_blank"><span style="font-family: courier;">TextFieldState</span> constructor</a>) without being remembered.</p>
<h2><span style="font-size: x-large;">Happy Composing</span></h2>
<p>We continue to invest in providing the features, performance, stability, libraries and tools that you need to build excellent apps. We value your input so please <a href="https://issuetracker.google.com/issues/new?component=612128" target="_blank">share feedback</a> on our latest updates or what you'd like to see next.</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p><br />Android Developershttp://www.blogger.com/profile/08588467489110681140[email protected]0tag:blogger.com,1999:blog-6755709643044947179.post-76904641001665297102025-05-20T10:59:00.000-07:002025-05-23T12:09:52.310-07:00Updates to the Android XR SDK: Introducing Developer Preview 2<meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj6E4IA0G-nsPWLYDP3V2rxang1_wR1bhfP5EeKqTJZlPRrpXBLDG5kKG3qli7-pYqmAo2wdu9p3zq8tUy3v6Ko0CbXu-jri5lNbCoR0fLLPTpKnum8zRLWzLcYmZ2qVd0_l0TFyzlHufD3J8Nn-iaD6M215KpYtHZXGhDTgN4vcIMSyvhK522Xnmowzdw/s1600/android-xr-google-io-meta.png" name="twitter:image"></meta>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj6E4IA0G-nsPWLYDP3V2rxang1_wR1bhfP5EeKqTJZlPRrpXBLDG5kKG3qli7-pYqmAo2wdu9p3zq8tUy3v6Ko0CbXu-jri5lNbCoR0fLLPTpKnum8zRLWzLcYmZ2qVd0_l0TFyzlHufD3J8Nn-iaD6M215KpYtHZXGhDTgN4vcIMSyvhK522Xnmowzdw/s1600/android-xr-google-io-meta.png" style="display: none;" />
<em>Posted by Matthew McCullough – VP of Product Management, Android Developer</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjPp6HgZqiN1uweKBPa65S2eBZNgSTYHCRswoCxX2JMlwBW7IcDQV_7K13IrOamurxjeYJwg_zejYbRQMQwyLuO4dX08M2JG2l83byhJXkLtUR5SOkl3FZb_x8oX6aTaC-3f6jaA3Z7C_ReYJWNud31TJL2sdgUEmeKERNAWiUpdtbozALI5UYp9nCattI/s1600/IO25-Blog-Hero-Template-Art-Long-01.png"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjPp6HgZqiN1uweKBPa65S2eBZNgSTYHCRswoCxX2JMlwBW7IcDQV_7K13IrOamurxjeYJwg_zejYbRQMQwyLuO4dX08M2JG2l83byhJXkLtUR5SOkl3FZb_x8oX6aTaC-3f6jaA3Z7C_ReYJWNud31TJL2sdgUEmeKERNAWiUpdtbozALI5UYp9nCattI/s1600/IO25-Blog-Hero-Template-Art-Long-01.png" /></a>
<p>Since launching the <a href="https://android-developers.googleblog.com/2024/12/introducing-android-xr-sdk-developer-preview.html" target="_blank">Android XR SDK Developer Preview</a> alongside Samsung, Qualcomm, and Unity last year, we’ve been blown away by all of the excitement we’ve been hearing from the broader Android community. Whether it's through <a href="https://www.youtube.com/watch?v=AkKjMtBYwDA&t=116s" target="_blank">coding live-streams</a> or local <a href="https://www.youtube.com/watch?v=RsFL8wvZFK8" target="_blank">Google Developer Group talks</a>, it's been an outstanding experience participating in the community to build the future of XR together, and we're just getting started.</p>
<p>Today we’re excited to share an update to the <a href="http://developer.android.com/xr" target="_blank">Android XR SDK</a>: Developer Preview 2, packed with new features and improvements to help you develop helpful and delightful immersive experiences with familiar Android APIs, tools and open standards created for XR.</p>
<p>At Google I/O, we have two technical sessions related to Android XR. The first is <a href="https://io.google/2025/explore/technical-session-22" target="_blank">Building differentiated apps for Android XR with 3D content</a>, which covers many features present in Jetpack SceneCore and ARCore for Jetpack XR. The second, <a href="https://io.google/2025/explore/technical-session-2" target="_blank">The future is now, with Compose and AI on Android XR</a>, covers creating XR-differentiated UI and our vision for the intersection of XR with cutting-edge AI capabilities.</p>
<image><div style="text-align: center;"><img alt="Android XR sessions at Google I/O 2025" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhN4ijYv9_IUNzf9FB1F6Wi9nwub5AEuPTHEwdgj7ep1NW_yN4IwmBTcyR7wE11l4pzTIUWgZeD_3QenGfrrRc4cpcRRtttqfV86Ie-7D3prjMRtyZx3MWyfvOnXycVLFo6c23a6HEze_nDtHZAuM_QOeXIxU-cL2HToAOkaZOev-WsQvVC9C-OMHW8H3k/s1600/android-xr-google-io-sessions.png" width="100%" /></div><imgcaption><center><em><a href="https://io.google/2025/explore/technical-session-22" target="_blank">Building differentiated apps for Android XR with 3D content</a> and <a href="https://io.google/2025/explore/technical-session-2" target="_blank">The future is now, with Compose and AI on Android XR</a></em></center></imgcaption></image>
<h2><span style="font-size: x-large;">What’s new in Developer Preview 2</span></h2>
<p>Since the release of <a href="https://developers.googleblog.com/en/introducing-android-xr-sdk-developer-preview/" target="_blank">Developer Preview 1</a>, we’ve been focused on making the APIs easier to use and adding new immersive Android XR features. Your feedback has helped us shape the development of the tools, SDKs, and the platform itself.</p>
<p>With the <b>Jetpack XR SDK</b>, you can now play back 180° and 360° videos, which can be stereoscopic by encoding with the MV-HEVC specification or by encoding view-frames adjacently. The MV-HEVC standard is optimized and designed for stereoscopic video, allowing your app to efficiently play back immersive videos at great quality. Apps built with Jetpack Compose for XR can use the <span style="font-family: courier;"><a href="https://developer.android.com/develop/xr/jetpack-xr-sdk/develop-ui#add-surface" target="_blank">SpatialExternalSurface</a></span> composable to render media, including stereoscopic videos.</p>
<p>Using <b>Jetpack Compose for XR</b>, you can now also define layouts that adapt to different XR display configurations. For example, use a <span style="font-family: courier;"><a href="https://developer.android.com/reference/kotlin/androidx/xr/compose/subspace/layout/SubspaceModifier#%28androidx.xr.compose.subspace.layout.SubspaceModifier%29.fillMaxSize%28kotlin.Float%29" target="_blank">SubspaceModifier</a></span> to specify the size of a <span style="font-family: courier;"><a href="https://developer.android.com/develop/xr/jetpack-xr-sdk/add-subspace" target="_blank">Subspace</a></span> as a percentage of the device’s recommended viewing size, so a panel effortlessly fills the space it's positioned in.</p>
<p><b>Material Design for XR</b> now supports more component overrides for <span style="font-family: courier;"><a href="https://developer.android.com/develop/ui/compose/components/app-bars" target="_blank">TopAppBar</a></span>, <span style="font-family: courier;"><a href="https://developer.android.com/develop/ui/compose/components/dialog#alert" target="_blank">AlertDialog</a></span>, and <span style="font-family: courier;"><a href="https://developer.android.com/develop/ui/compose/layouts/adaptive/list-detail" target="_blank">ListDetailPaneScaffold</a></span>, helping your large-screen enabled apps that use Material Design effortlessly adapt to the new world of XR.</p>
<image><div style="text-align: center;"><img alt="An app adapts to XR using Material Design for XR with the new component overrides" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj8nZO4V5FMG8zYiut0XDvyVhKVQiplNwEUoECmtbI_w_CQPpG-LnmcsremuuyfVDuPdfv3XX8RPq4XHAXSKJisCILs4at38y2UQJntrh8fPSEk-_wN6M2CoVaWDmV20_BmPAt9ETHgwz5mm1PWgVhTgZ5I57DL4HIx7NDS_MbT4-Z47EzYCi6CALc2gMM/s1600/app-adapts-android-xr-material-design-google-io.png" width="100%" /></div><imgcaption><center><em>An app adapts to XR using Material Design for XR with the new component overrides</em></center></imgcaption></image><br />
<p>In <b>ARCore for Jetpack XR</b>, you can now track hands after requesting the appropriate permissions. Hands are a collection of 26 posed hand joints that can be used to detect hand gestures and bring a whole new level of interaction to your Android XR apps:</p>
<image><div style="text-align: center;"><img alt="moving image demonstrates how hands bring a natural input method to your Android XR experience." border="0" height="553" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh5ECo8aP0SFOaw8qfkewc866hbkOZ9ZJgdimmeFU_n3VNGTh6FK1m4R5sladem74xrUsF0QrevO4abVsPrVNq0FVU3t4pM8uPiItdCeZumsOzfZh_cskSnzM19EnoP52mf019a16-vtVrxxJX4M5lIWAHc5XfymTo0jg-lvO8s9UWruSnSVdqT5uUbNLA/w640-h553/arcore-jetpack-android-xr-google-io.gif" width="640" /></div><imgcaption><center><em>Hands bring a natural input method to your Android XR experience.</em></center></imgcaption></image><br />
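<p>To give a feel for what building on posed hand joints can look like, the sketch below flags a pinch when the thumb tip and index fingertip come close together. The <span style="font-family: courier;">Joint</span> type, joint names, and distance threshold are illustrative assumptions, not the ARCore for Jetpack XR API.</p>

```kotlin
import kotlin.math.sqrt

// Hypothetical stand-in for a posed hand joint's position, in meters.
data class Joint(val x: Float, val y: Float, val z: Float)

fun distance(a: Joint, b: Joint): Float {
    val dx = a.x - b.x
    val dy = a.y - b.y
    val dz = a.z - b.z
    return sqrt(dx * dx + dy * dy + dz * dz)
}

// Assume a pinch when the two fingertips are within ~1.5 cm of each other.
fun isPinching(
    thumbTip: Joint,
    indexTip: Joint,
    thresholdMeters: Float = 0.015f
): Boolean = distance(thumbTip, indexTip) < thresholdMeters

fun main() {
    val thumb = Joint(0.00f, 0.00f, 0.00f)
    val index = Joint(0.01f, 0.00f, 0.00f)
    println(isPinching(thumb, index)) // true: fingertips 1 cm apart
}
```

A production gesture detector would also smooth joint positions over time and check joint-pose confidence, but the core idea is simple geometry over the 26 tracked joints.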
<p>For more guidance on developing apps for Android XR, check out our <a href="https://developer.android.com/codelabs/xr-fundamentals-part-1" target="_blank">Android XR Fundamentals codelab</a>, the updates to our <a href="http://goo.gle/haxr" target="_blank">Hello Android XR sample project</a>, and <a href="http://goo.gle/adaptive-jetstream" target="_blank">a new version of JetStream</a> with Android XR support.</p>
<p>The <b>Android XR Emulator</b> has also received stability updates and support for AMD GPUs, and is now fully integrated within the Android Studio UI.</p>
<image><div style="text-align: center;"><img alt="the Android XR Emulator in Android Studio" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhMsM6_DNRkrBrxM6yuvk2YW7f6VU0XG3o4xPPzxG0b_SzdVelcRKbVplMsu136nio1UpIotiAXjQtL_1ok7DMaMMZOsFKyVidFhzSKN1dWvevZ144LcODGH1O9pLVOJq49JdGh6kGywH4B451BaRI9w5jgmyREMoMK2pcwN9rFZG12v4Vft5qVgIKHE2Y/s16000/android-xr-emulator-in-android-studio-google-io.png" /></div><imgcaption><center><em>The Android XR Emulator is now integrated in Android Studio</em></center></imgcaption></image><br />
<p>Developers using Unity have <a href="https://unity.com/blog/porting-apps-games-over-android-xr-unity-6" target="_blank">already successfully created and ported existing games and apps to Android XR</a>. Today, you can upgrade to the <a href="https://discussions.unity.com/t/android-xr-pre-release-now-available/1634938" target="_blank">Pre-Release version 2</a> of the Unity OpenXR: Android XR package! This update adds many performance improvements such as support for <a href="https://docs.unity3d.com/Packages/[email protected]/manual/features/display-utilities.html" target="_blank">Dynamic Refresh Rate</a>, which optimizes your app’s performance and power consumption. Shaders made with <a href="https://unity.com/features/shader-graph" target="_blank">Shader Graph</a> now support <a href="https://docs.unity3d.com/6000.1/Documentation/Manual/xr-graphics-spacewarp.html" target="_blank">SpaceWarp</a>, making it easier to use SpaceWarp to reduce compute load on the device. Hand meshes are now exposed with occlusion, which enables realistic hand visualization.</p>
<p>Check out Unity’s <a href="https://docs.unity3d.com/Packages/[email protected]/manual/index.html" target="_blank">improved Mixed Reality template</a> for Android XR, which now includes support for occlusion and persistent anchors.</p>
<p>We recently launched <a href="https://github.com/android/xr-unity-samples" target="_blank">Android XR Samples for Unity</a>, which demonstrate capabilities on the Android XR platform such as hand tracking, plane tracking, face tracking, and passthrough.</p>
<image><div style="text-align: center;"><img alt="moving image of Google’s open-source Unity samples demonstrating platform features and showing how they’re implemented" border="0" height="640" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiHiuAwjZJtPTRL6q7bvpY4xwXfqGj52bT1fYywSchvBHYVjeqi0_17QOZdn0BylkT9J-txSEJUjZu2WjMheml_sGX17Goi5ZkYSCVhvQ7i8JzXiuakPcR1_XnaRg7SI5OhWvN0ImaxLcpcHbky6rtz7vOXsOV4HZc6PVaeu623XnM6R23lM5tNZ171FxU/w640-h640/unity-demo-android-xr-google-io.gif" width="640" /></div><imgcaption><center><em>Google’s open-source Unity samples demonstrate platform features and show how they’re implemented</em></center></imgcaption></image><br />
<p>Firebase AI Logic for Unity is now in public preview! This makes it easy for you to integrate gen AI into your apps, enabling the creation of AI-powered experiences with Gemini and Android XR. Firebase AI Logic fully supports Gemini's capabilities, including multimodal input and output, and bi-directional streaming for immersive conversational interfaces. Built with production readiness in mind, Firebase AI Logic is integrated with core Firebase services like App Check, Remote Config, and Cloud Storage for enhanced security, configurability, and data management. Learn more about this on <a href="https://firebase.blog/posts/2025/05/ai-logic-unity-androidxr" target="_blank">the Firebase blog</a> or go straight to the <a href="https://firebase.google.com/docs/vertex-ai/get-started" target="_blank">Gemini API using Vertex AI in Firebase SDK documentation</a> to get started.</p>
<h2><span style="font-size: x-large;">Continuing to build the future together</span></h2>
<p>Our commitment to open standards continues with the <a href="https://www.khronos.org/blog/gltf-interactivity-specification-released-for-public-comment" target="_blank">glTF Interactivity specification</a>, developed in collaboration with the Khronos Group, which will be supported in glTF models rendered by Jetpack XR later this year. Models using the glTF Interactivity specification are self-contained interactive assets that can have many pre-programmed behaviors, like rotating objects on a button press or changing the color of a material over time.</p>
<p>Android XR will be available first on Samsung’s Project Moohan, launching later this year. Soon after, our partners at XREAL will release the next Android XR device. Codenamed Project Aura, it’s a portable and tethered device that gives users access to their favorite Android apps, including those that have been built for XR. It will launch as a developer edition, specifically for you to begin creating and experimenting. The best news? With the familiar tools you use to build Android apps today, you can build for these devices too.</p>
<image><div style="text-align: center;"><img alt="product image of XREAL’s Project Aura against a nebulous black background" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgujCjZD5_MiRHDiDp_O-PRCGs_CdfARZbtfC5xqrKgMlJbWZQxNwdNN9C4SwVYB7Lu0Pm8GTubBcVenZm07pCIMAuAx5WcFjADHyE0JeQYhIlEmTqoW0te7xBz15Ab3Oh7C8IN4uk5zLnvqRJmUIEz0lOtSMO-e1oeiWK-jo2NvvrzBQoNUSU7_0UFmf4/s1600/android-xr-google-io-meta.png" width="100%" /></div><imgcaption><center><em>XREAL’s Project Aura</em></center></imgcaption></image><br />
<p>The Google Play Store is also getting ready for Android XR. It will list <a href="https://developer.android.com/develop/xr/get-started#app_manifest_compatibility_considerations_for_mobile_and_large_screen_apps" target="_blank">supported 2D Android apps</a> on the Android XR Play Store when it launches later this year. If you are working on an <a href="https://developer.android.com/docs/quality-guidelines/android-xr#android-xr-differentiated" target="_blank">Android XR differentiated app</a>, you can get it ready for the big launch and be one of the first differentiated apps on the Android XR Play Store:</p>
<ul><ul>
<li>Install and test your existing app in the <a href="https://developer.android.com/develop/xr/jetpack-xr-sdk/studio-tools" target="_blank">Android XR Emulator</a></li>
<li>Learn how to <a href="https://developer.android.com/develop/xr/package-and-distribute" target="_blank">package and distribute apps for Android XR</a></li>
<li>New! <a href="https://support.google.com/googleplay/android-developer/answer/9866151?hl=en&ref_topic=3450987&sjid=1442238423744171917-NA" target="_blank">Make your XR app stand out</a> from others on Play Store with preview assets such as stereoscopic 180° or 360° videos, as well as screenshots, app description, and non-spatial video.</li>
</ul></ul>
<p>And we know many of you are excited for the future of Android XR on <a href="https://blog.google/products/android/android-xr-gemini-glasses-headsets/" target="_blank">glasses</a>. We are shaping the developer experience now and will share more details on how you can participate later this year.</p>
<p>To get started creating and developing for Android XR, check out <a href="https://d.android.com/develop/xr" target="_blank">developer.android.com/develop/xr</a> where you will find all of the tools, libraries, and resources you need to work with the Android XR SDK. In particular, try out our <a href="https://developer.android.com/develop/xr/samples" target="_blank">samples</a> and <a href="https://developer.android.com/codelabs/xr-fundamentals-part-1#0" target="_blank">codelabs</a>.</p>
<p>We welcome your <a href="https://d.android.com/develop/xr/support" target="_blank">feedback, suggestions, and ideas</a> as you’re helping shape Android XR. Your passion, expertise, and bold ideas are vital as we continue to develop Android XR together. We look forward to seeing your XR-differentiated apps when Android XR devices launch later this year!</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p><br />Android Developershttp://www.blogger.com/profile/08588467489110681140[email protected]0tag:blogger.com,1999:blog-6755709643044947179.post-31437695330202848662025-05-20T10:58:00.000-07:002025-05-20T12:38:22.999-07:00Peacock built adaptively on Android to deliver great experiences across screens <meta name="twitter:image" content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEixLVwc_aeyW7UAK0sMPkC2fx3szkb0xOx4g9txyLJ15pzekTIx4-fbHcmPi0bQ6gJlpR4s4MKRQWmKd6zpooI4NYYrZiVUIJC36DEmAu0Yg-yZD76Zu3W6yJnQT2zDhj2d8bSAkdGEgWdSezKcIMO03TwDVOdRUbfUSVOzTaMo2tUWuNN3ZH8XcGdFNcM/s1600/peacock-adaptive-android-case-study.png">
<img style="display:none" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEixLVwc_aeyW7UAK0sMPkC2fx3szkb0xOx4g9txyLJ15pzekTIx4-fbHcmPi0bQ6gJlpR4s4MKRQWmKd6zpooI4NYYrZiVUIJC36DEmAu0Yg-yZD76Zu3W6yJnQT2zDhj2d8bSAkdGEgWdSezKcIMO03TwDVOdRUbfUSVOzTaMo2tUWuNN3ZH8XcGdFNcM/s1600/peacock-adaptive-android-case-study.png">
<em>Posted by Sa-ryong Kang and Miguel Montemayor - Developer Relations Engineers
</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEixLVwc_aeyW7UAK0sMPkC2fx3szkb0xOx4g9txyLJ15pzekTIx4-fbHcmPi0bQ6gJlpR4s4MKRQWmKd6zpooI4NYYrZiVUIJC36DEmAu0Yg-yZD76Zu3W6yJnQT2zDhj2d8bSAkdGEgWdSezKcIMO03TwDVOdRUbfUSVOzTaMo2tUWuNN3ZH8XcGdFNcM/s1600/peacock-adaptive-android-case-study.png" imageanchor="1"><img style="width: 100%;" border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEixLVwc_aeyW7UAK0sMPkC2fx3szkb0xOx4g9txyLJ15pzekTIx4-fbHcmPi0bQ6gJlpR4s4MKRQWmKd6zpooI4NYYrZiVUIJC36DEmAu0Yg-yZD76Zu3W6yJnQT2zDhj2d8bSAkdGEgWdSezKcIMO03TwDVOdRUbfUSVOzTaMo2tUWuNN3ZH8XcGdFNcM/s1600/peacock-adaptive-android-case-study.png" data-original-width="100%" data-original-height="800" /></a>
<p><a href="https://play.google.com/store/apps/details?id=com.peacocktv.peacockandroid&hl=en_US&pli=1" target="_blank">Peacock</a> is <a href="https://www.nbcuniversal.com/" target="_blank">NBCUniversal’s</a> streaming service app available in the US, offering culture-defining entertainment including live sports, exclusive original content, TV shows, and blockbuster movies. The app continues to evolve, becoming more than just a platform to watch content: it is a hub of entertainment.</p>
<p>Today’s users are consuming entertainment on an increasingly wider array of device sizes and types, and in particular are moving towards mobile devices. Peacock has adopted Jetpack Compose to help with its journey in adapting to more screens and meeting users where they are.</p>
<iframe class="BLOG_video_class" allowfullscreen="" youtube-src-id="ooRcQFMYzmA" width="100%" height="498" src="https://www.youtube.com/embed/ooRcQFMYzmA"></iframe><imgcaption><center><em><b>Disclaimer:</b> Peacock is available in the US only. This video will only be viewable to US viewers.</em></center></imgcaption><br/>
<h2><span style="font-size: x-large;">Adapting to more flexible form factors</span></h2>
<p>The Peacock development team is focused on bringing the best experience to users, no matter what device they’re using or when they want to consume content. With an emerging trend from app users to watch more on mobile devices and large screens like foldables, the Peacock app needs to be able to <a href="https://developer.android.com/adaptive-apps" target="_blank">adapt to different screen sizes</a>. As more devices are introduced, the team needed to explore new solutions that make the most out of each unique display permutation.</p>
<p>The goal was to have the Peacock app adapt to these new displays while continuing to offer high-quality entertainment without interruptions, such as the stream reloading or visual errors. While thinking ahead, they also wanted to prepare and build a solution that was ready for <a href="https://developer.android.com/develop/xr/get-started" target="_blank">Android XR</a> as the entertainment landscape is shifting towards including more immersive experiences.</p>
<image><div style="text-align: center;"><img alt="quote card featuring a headshot of Diego Valente, Head of Mobile, Peacock & Global Streaming, reads 'Thinking adaptively isn't just about supporting tablets or large screens - it's about future proofing your app. Investing in adaptability helps you meet users' expectations of having seamless experiences across all their devices and sets you up for what's next.'" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhGb1jINgpKXi8zg_zdSmeSGYbnqTf9aZhb6b9xe8aXcarcM5u3UNpcMAcnWYDqxnxxGy8d7RL9C-GQU1nOzQR9SxNc3-y0RgDDGffinWUTnVUuQOUdq4fYLaz4m27Uwzgw-H02lYE0U3wVoiAkSW83qvJzWUkfKYkiYF0iFmCE6PfJEb03Hd9GXzB7V_Q/s1600/peacock-quote-card-adaptive-android-google-io.png" width="100%"/></div></image><br/>
<h2><span style="font-size: x-large;">Building a future-proof experience with Jetpack Compose</span></h2>
<p>In order to build a scalable solution that would help the Peacock app continue to evolve, the app was migrated to Jetpack Compose, Android’s toolkit for building scalable UI. One of the essential tools they used was the <a href="https://developer.android.com/develop/ui/compose/layouts/adaptive/use-window-size-classes" target="_blank">WindowSizeClass API</a>, which helps developers create and test UI layouts for different size ranges. This API then allows the app to seamlessly switch between pre-set layouts as it reaches established viewport breakpoints for different window sizes.</p>
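<p>Conceptually, window size classes bucket the current window into a small set of documented breakpoints: compact (under 600 dp wide), medium (600 to 839 dp), and expanded (840 dp and up). The following is a simplified plain-Kotlin sketch of that width bucketing; the enum and function are illustrative, not the androidx <span style="font-family: courier;">WindowSizeClass</span> API, which also classifies window height.</p>

```kotlin
// Simplified model of the Material 3 width-based window size classes.
enum class WidthClass { COMPACT, MEDIUM, EXPANDED }

fun widthClassOf(widthDp: Int): WidthClass = when {
    widthDp < 600 -> WidthClass.COMPACT   // typical phone in portrait
    widthDp < 840 -> WidthClass.MEDIUM    // unfolded foldable, small tablet
    else -> WidthClass.EXPANDED           // tablet or desktop window
}

fun main() {
    println(widthClassOf(411))  // COMPACT
    println(widthClassOf(673))  // MEDIUM
    println(widthClassOf(1280)) // EXPANDED
}
```

Because the UI switches layouts only at these breakpoints rather than reacting to every pixel of resize, a layout chosen per size class stays stable as the window changes within a bucket.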
<p>The API was used in conjunction with <a href="https://developer.android.com/kotlin/coroutines" target="_blank">Kotlin Coroutines</a> and <a href="https://developer.android.com/kotlin/flow" target="_blank">Flows</a> to keep the UI state responsive as the window size changed. To test their work and fine-tune behavior on edge-case devices, Peacock used the Android Studio emulator to simulate a wide range of Android-based devices.</p>
<p>Jetpack Compose allowed the team to build adaptively, so now the Peacock app responds to a wide variety of screens while offering a seamless experience to Android users. “The app feels more native, more fluid, and more intuitive across all form factors,” said Diego Valente, Head of Mobile, Peacock and Global Streaming. “That means users can start watching on a smaller screen and continue instantly on a larger one when they unfold the device—no reloads, no friction. It just works.”</p>
<h2><span style="font-size: x-large;">Preparing for immersive entertainment experiences</span></h2>
<p>In building adaptive apps on Android, John Jelley, Senior Vice President, Product & UX, Peacock and Global Streaming, says Peacock has also laid the groundwork to quickly adapt to the Android XR platform: “Android XR builds on the same large screen principles, our investment here naturally extends to those emerging experiences with less developmental work.”</p>
<p>The team is excited about the prospect of features unlocked by Android XR, like Multiview for sports and TV, which enables users to watch multiple games or camera angles at once. By tailoring spatial windows to the user’s environment, the app could offer new ways for users to interact with contextual metadata like sports stats or actor information—all without ever interrupting their experience.</p>
<h2><span style="font-size: x-large;">Build adaptive apps</span></h2>
<p>Learn how to <a href="https://developer.android.com/adaptive-apps" target="_blank">unlock your app's full potential</a> on phones, tablets, foldables, and beyond.</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p><br />Android Developershttp://www.blogger.com/profile/08588467489110681140[email protected]0tag:blogger.com,1999:blog-6755709643044947179.post-61655085035388003202025-05-20T10:57:00.000-07:002025-05-20T12:38:53.265-07:00On-device GenAI APIs as part of ML Kit help you easily build with Gemini Nano<meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh96stpV5BvqzxBhk12xYFEg5g-5U4H93DJOId6MEDioi_kPsXWnW3tBwd_kYQfIrcaD4h6QF3cylN62tkiTMit-K6haiB8QSJ2Lnp9ggL_bY_hNwu-FX3HquDmZ98rMYoyug6PxJ0qQQ2_7hlB1BfhZXvh8IFweZJDRbplE0CdwrjjOsMH1GKfC_6hHXo/s1600/gen-ai-api-android-meta%20%283%29.png" name="twitter:image"></meta>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh96stpV5BvqzxBhk12xYFEg5g-5U4H93DJOId6MEDioi_kPsXWnW3tBwd_kYQfIrcaD4h6QF3cylN62tkiTMit-K6haiB8QSJ2Lnp9ggL_bY_hNwu-FX3HquDmZ98rMYoyug6PxJ0qQQ2_7hlB1BfhZXvh8IFweZJDRbplE0CdwrjjOsMH1GKfC_6hHXo/s1600/gen-ai-api-android-meta%20%283%29.png" style="display: none;" />
<em>Posted by Caren Chang - Developer Relations Engineer, Chengji Yan - Software Engineer, Taj Darra - Product Manager</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjUnecK6GCD0zX60dSKM1fspl7RcCZh3kaWG-PYKw3Yf4Vyf_3NXlsQoRAVQ_milOQqfBS6gtDODFwK5b9SskqFT7tu6fYDr8WcqxuJQxqIrA9L8VVVsxgSGJogtFqgFe0wVZswvVjTU1Zuf1-ZkAT_bPXL29iIVooQzoStIDZEbaVA0ygvoggMoMsCXrM/s1600/gen-ai-api-android-hero%20%281%29.png"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjUnecK6GCD0zX60dSKM1fspl7RcCZh3kaWG-PYKw3Yf4Vyf_3NXlsQoRAVQ_milOQqfBS6gtDODFwK5b9SskqFT7tu6fYDr8WcqxuJQxqIrA9L8VVVsxgSGJogtFqgFe0wVZswvVjTU1Zuf1-ZkAT_bPXL29iIVooQzoStIDZEbaVA0ygvoggMoMsCXrM/s1600/gen-ai-api-android-hero%20%281%29.png" /></a>
<p>We are excited to announce a set of on-device <a href="https://developers.google.com/ml-kit/genai" target="_blank">GenAI APIs, as part of ML Kit</a>, to help you integrate Gemini Nano in your Android apps.</p>
<p>To start, we are releasing four new APIs:</p>
<ul>
<li><b>Summarization</b>: to summarize articles and conversations</li>
<li><b>Proofreading</b>: to polish short text</li>
<li><b>Rewriting</b>: to reword text in different styles</li>
<li><b>Image Description</b>: to provide short descriptions for images</li>
</ul>
<h2><span style="font-size: x-large;">Key benefits of GenAI APIs</span></h2>
<p>GenAI APIs are high level APIs that allow for easy integration, similar to existing <a href="https://developers.google.com/ml-kit" target="_blank">ML Kit APIs</a>. This means you can expect quality results out of the box without extra effort for prompt engineering or fine tuning for specific use cases.</p>
<p>GenAI APIs run on-device and thus provide the following benefits:</p>
<ul>
<li>Input, inference, and output data are processed locally</li>
<li>Functionality remains the same even without a reliable internet connection</li>
<li>No additional cost is incurred for each API call</li>
</ul>
<p>To prevent misuse, we also added safety protection in various layers, including base model training, safety-aware LoRA fine-tuning, input and output classifiers and safety evaluations.</p>
<h2><span style="font-size: x-large;">How GenAI APIs are built</span></h2>
<p>There are four main components that make up each of the GenAI APIs.</p>
<ol>
<li>Gemini Nano, the base model, serves as the foundation shared by all of the APIs.</li>
<li>Small API-specific LoRA adapter models are trained and deployed on top of the base model to further improve the quality for each API.</li>
<li>Optimized inference parameters (e.g. prompt, temperature, topK, batch size) are tuned for each API to guide the model in returning the best results.</li>
<li>An evaluation pipeline ensures quality across various datasets and attributes. This pipeline consists of LLM raters, statistical metrics, and human raters.</li>
</ol>
<p>Together, these components make up the high-level GenAI APIs that simplify the effort needed to integrate Gemini Nano in your Android app.</p>
<h2><span style="font-size: x-large;">Evaluating quality of GenAI APIs</span></h2>
<p>For each API, we formulate a benchmark score based on the evaluation pipeline mentioned above. This score is based on attributes specific to each task. For example, when evaluating the summarization task, one of the attributes we look at is “grounding” (i.e., the factual consistency of the generated summary with the source content).</p>
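<p>The post doesn't publish how the three rater signals are combined into a single benchmark score, so the weighted average below (including its weights and the <span style="font-family: courier;">EvalSignals</span> names) is purely a hypothetical sketch of what such an aggregation could look like:</p>

```kotlin
// Hypothetical sketch only: the weights and field names are illustrative,
// not ML Kit's actual scoring formula.
data class EvalSignals(
    val llmRater: Double,    // score from automated LLM raters
    val statistical: Double, // score from statistical metrics (e.g. grounding)
    val human: Double        // score from human raters
)

fun benchmarkScore(
    s: EvalSignals,
    wLlm: Double = 0.4,
    wStat: Double = 0.2,
    wHuman: Double = 0.4
): Double = wLlm * s.llmRater + wStat * s.statistical + wHuman * s.human

fun main() {
    // If all three signals agree, the aggregate equals that common score.
    println(benchmarkScore(EvalSignals(90.0, 90.0, 90.0)))
}
```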
<p>To provide out-of-the-box quality for GenAI APIs, we applied feature-specific fine-tuning on top of the Gemini Nano base model. This resulted in an increase in the benchmark score of each API, as shown below:</p>
<table style="width: 100%;">
<tbody><tr>
<th style="text-align: left;"><b>Use case in English</b></th>
<th style="text-align: left;">Gemini Nano Base Model</th>
<th style="text-align: left;">ML Kit GenAI API</th>
</tr>
<tr>
<td style="text-align: left;">Summarization</td>
<td style="text-align: left;">77.2</td>
<td style="text-align: left;">92.1</td>
</tr>
<tr>
<td style="text-align: left;">Proofreading</td>
<td style="text-align: left;">84.3</td>
<td style="text-align: left;">90.2</td>
</tr>
<tr>
<td style="text-align: left;">Rewriting</td>
<td style="text-align: left;">79.5</td>
<td style="text-align: left;">84.1</td>
</tr>
<tr>
<td style="text-align: left;">Image Description</td>
<td style="text-align: left;">86.9</td>
<td style="text-align: left;">92.3</td>
</tr>
</tbody></table><br />
<p>In addition, this is a quick reference of how the APIs perform on a Pixel 9 Pro:</p>
<table style="width: 100%;">
<tbody><tr>
<th style="text-align: left;"></th>
<th style="text-align: left;"><b>Prefix Speed<br />(input processing rate)</b></th>
<th style="text-align: left;"><b>Decode Speed<br />(output generation rate)</b></th>
</tr>
<tr>
<td style="text-align: left; width: 20%;">Text-to-text</td>
<td style="text-align: left; width: 40%;">510 tokens/second</td>
<td style="text-align: left; width: 40%;">11 tokens/second</td>
</tr>
<tr>
<td style="text-align: left; width: 20%;">Image-to-text</td>
<td style="text-align: left; width: 40%;">510 tokens/second + 0.8 seconds for image encoding</td>
<td style="text-align: left; width: 40%;">11 tokens/second</td>
</tr>
</tbody></table>
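<p>As a rule of thumb, the table above yields a back-of-the-envelope latency estimate: input tokens divided by the prefix speed, plus output tokens divided by the decode speed (plus roughly 0.8 seconds of image encoding for image-to-text). The helper below is an illustrative sketch of that arithmetic, not part of the ML Kit API, and real latency will vary by device and feature:</p>

```kotlin
// Back-of-the-envelope latency estimate from the published Pixel 9 Pro
// rates: 510 tokens/s prefix (input) and 11 tokens/s decode (output),
// plus ~0.8 s of image encoding for image-to-text requests.
// Illustrative only; actual performance varies by device and feature.
fun estimateSeconds(
    inputTokens: Int,
    outputTokens: Int,
    hasImage: Boolean = false,
    prefixTokensPerSec: Double = 510.0,
    decodeTokensPerSec: Double = 11.0,
    imageEncodeSec: Double = 0.8
): Double {
    val prefixTime = inputTokens / prefixTokensPerSec
    val decodeTime = outputTokens / decodeTokensPerSec
    return prefixTime + decodeTime + if (hasImage) imageEncodeSec else 0.0
}

fun main() {
    // A 2,000-token article summarized into a ~60-token bullet:
    // 2000/510 + 60/11 ≈ 9.4 seconds, dominated by decoding.
    println(estimateSeconds(inputTokens = 2000, outputTokens = 60))
}
```

<p>Note how the slow decode rate dominates: keeping the requested output short matters far more for perceived latency than trimming the input.</p>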
<h2><span style="font-size: x-large;">Sample usage</span></h2>
<p>This is an example of implementing the GenAI Summarization API to get a one-bullet summary of an article:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: green; font-weight: bold;">val</span> articleToSummarize = <span style="color: #ba2121;">"We are excited to announce a set of on-device generative AI APIs..."</span>
<span style="color: #408080; font-style: italic;">// Define task with desired input and output format</span>
<span style="color: green; font-weight: bold;">val</span> summarizerOptions = SummarizerOptions.builder(context)
.setInputType(InputType.ARTICLE)
.setOutputType(OutputType.ONE_BULLET)
.setLanguage(Language.ENGLISH)
.build()
<span style="color: green; font-weight: bold;">val</span> summarizer = Summarization.getClient(summarizerOptions)
suspend <span style="color: green; font-weight: bold;">fun</span> <span style="color: blue;">prepareAndStartSummarization</span>(context: Context) {
<span style="color: #408080; font-style: italic;">// Check feature availability. Status will be one of the following: </span>
<span style="color: #408080; font-style: italic;">// UNAVAILABLE, DOWNLOADABLE, DOWNLOADING, AVAILABLE</span>
<span style="color: green; font-weight: bold;">val</span> featureStatus = summarizer.checkFeatureStatus().await()
<span style="color: green; font-weight: bold;">if</span> (featureStatus == FeatureStatus.DOWNLOADABLE) {
<span style="color: #408080; font-style: italic;">// Download feature if necessary.</span>
<span style="color: #408080; font-style: italic;">// If downloadFeature is not called, the first inference request will </span>
<span style="color: #408080; font-style: italic;">// also trigger the feature to be downloaded if it's not already</span>
<span style="color: #408080; font-style: italic;">// downloaded.</span>
summarizer.downloadFeature(object : DownloadCallback {
<span style="color: green; font-weight: bold;">override</span> <span style="color: green; font-weight: bold;">fun</span> <span style="color: blue;">onDownloadStarted</span>(bytesToDownload: Long) { }
<span style="color: green; font-weight: bold;">override</span> <span style="color: green; font-weight: bold;">fun</span> <span style="color: blue;">onDownloadFailed</span>(e: GenAiException) { }
<span style="color: green; font-weight: bold;">override</span> <span style="color: green; font-weight: bold;">fun</span> <span style="color: blue;">onDownloadProgress</span>(totalBytesDownloaded: Long) {}
<span style="color: green; font-weight: bold;">override</span> <span style="color: green; font-weight: bold;">fun</span> <span style="color: blue;">onDownloadCompleted</span>() {
startSummarizationRequest(articleToSummarize, summarizer)
}
})
} <span style="color: green; font-weight: bold;">else</span> <span style="color: green; font-weight: bold;">if</span> (featureStatus == FeatureStatus.DOWNLOADING) {
<span style="color: #408080; font-style: italic;">// Inference request will automatically run once feature is </span>
<span style="color: #408080; font-style: italic;">// downloaded.</span>
<span style="color: #408080; font-style: italic;">// If Gemini Nano is already downloaded on the device, the </span>
<span style="color: #408080; font-style: italic;">// feature-specific LoRA adapter model will be downloaded very </span>
<span style="color: #408080; font-style: italic;">// quickly. However, if Gemini Nano is not already downloaded, </span>
<span style="color: #408080; font-style: italic;">// the download process may take longer.</span>
startSummarizationRequest(articleToSummarize, summarizer)
} <span style="color: green; font-weight: bold;">else</span> <span style="color: green; font-weight: bold;">if</span> (featureStatus == FeatureStatus.AVAILABLE) {
startSummarizationRequest(articleToSummarize, summarizer)
}
}
<span style="color: green; font-weight: bold;">fun</span> <span style="color: blue;">startSummarizationRequest</span>(text: String, summarizer: Summarizer) {
<span style="color: #408080; font-style: italic;">// Create task request </span>
<span style="color: green; font-weight: bold;">val</span> summarizationRequest = SummarizationRequest.builder(text).build()
<span style="color: #408080; font-style: italic;">// Start summarization request with streaming response</span>
summarizer.runInference(summarizationRequest) { newText ->
<span style="color: #408080; font-style: italic;">// Show new text in UI</span>
}
<span style="color: #408080; font-style: italic;">// You can also get a non-streaming response from the request</span>
<span style="color: #408080; font-style: italic;">// val summarizationResult = summarizer.runInference(summarizationRequest)</span>
<span style="color: #408080; font-style: italic;">// val summary = summarizationResult.get().summary</span>
}
<span style="color: #408080; font-style: italic;">// Be sure to release the resource when no longer needed</span>
<span style="color: #408080; font-style: italic;">// For example, on viewModel.onCleared() or activity.onDestroy()</span>
summarizer.close()
</pre></div><br />
<p>For more examples of implementing the GenAI APIs, check out the official <a href="https://developers.google.com/ml-kit/genai" target="_blank">documentation</a> and samples on GitHub:</p>
<ul>
<li><a href="https://github.com/android/ai-samples/tree/main/ai-catalog/samples" target="_blank">AI Catalog GenAI API Samples with Compose</a></li>
<li><a href="https://github.com/googlesamples/mlkit/tree/master/android/genai" target="_blank">ML Kit GenAI APIs Quickstart</a></li>
</ul>
<h2><span style="font-size: x-large;">Use cases</span></h2>
<p>Here is some guidance on how to best use the current GenAI APIs:</p>
<p>For <b>Summarization</b>, consider:</p>
<ul>
<li>Conversation messages or transcripts that involve 2 or more users</li>
<li>Articles or documents of fewer than 4000 tokens (about 3000 English words). Using the first few paragraphs for summarization is usually enough to capture the most important information.</li>
</ul>
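<p>One way to stay within that budget is to trim the input before building the summarization request. The helper below is a hypothetical pre-processing sketch, not part of the ML Kit API; the 3000-word default mirrors the guidance above, and the token-per-word ratio varies by content:</p>

```kotlin
// Hypothetical pre-processing sketch (not part of the ML Kit API):
// keep summarization input under the suggested ~4000-token /
// ~3000-word budget by truncating at a word limit before sending
// the text to the Summarization API.
fun truncateToWordBudget(text: String, maxWords: Int = 3000): String {
    val words = text.trim().split(Regex("\\s+"))
    return if (words.size <= maxWords) text.trim()
    else words.take(maxWords).joinToString(" ")
}

fun main() {
    val article = "one two three four five"
    println(truncateToWordBudget(article, maxWords = 3)) // prints "one two three"
}
```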
<p>For <b>Proofreading</b> and <b>Rewriting</b> APIs, consider utilizing them during the content creation process for short content below 256 tokens to help with tasks such as:</p>
<ul>
<li>Refining messages in a particular tone, such as more formal or more casual</li>
<li>Polishing personal notes for easier consumption later</li>
</ul>
<p>For the <b>Image Description</b> API, consider it for:</p>
<ul>
<li>Generating titles for images</li>
<li>Generating metadata for image search</li>
<li>Utilizing descriptions of images in use cases where the images themselves cannot be displayed, such as within a list of chat messages</li>
<li>Generating alternative text to help visually impaired users better understand content as a whole</li>
</ul>
<h2><span style="font-size: x-large;">GenAI API in production</span></h2>
<p>Envision is an app that verbalizes the visual world to help people who are blind or have low vision lead more independent lives. A common use case in the app is for users to take a picture to have a document read out loud. Utilizing the GenAI Summarization API, Envision is now able to get a concise summary of a captured document. This significantly enhances the user experience by allowing them to quickly grasp the main points of documents and determine if a more detailed reading is desired, saving them time and effort.</p>
<image><div style="text-align: center;"><img alt="side by side images of a mobile device showing a document on a table on the left, and the results of the scanned document on the right showing details providing the what, when, and where as written in the document" border="0" height="613" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgkOrO3R_GVdd26XH3C6XcLCgJ39SYS4z6i4spYGRPo8Z0uJtmwmnzejROH0b__EnZqxQ7rRfZiUCb2qYodgZCUCht43LaC0KCO2VGVtDSqcjc-eJ7Vp6f8U6qToWOakAqhflG0_e2t9fWa7JYdx26ykZZFqbVwTOYOroHmjoDRsxM2KzZmAU7CM2I_fuo/w640-h613/generative-api-in-action-android-envision.png" width="640" /></div></image><br />
<h2><span style="font-size: x-large;">Supported devices</span></h2>
<p>GenAI APIs are available on Android devices using optimized MediaTek Dimensity, Qualcomm Snapdragon, and Google Tensor platforms through AICore. For a comprehensive list of devices that support GenAI APIs, refer to our <a href="https://developers.google.com/ml-kit/genai#device-support" target="_blank">official documentation</a>.</p>
<h2><span style="font-size: x-large;">Learn more</span></h2>
<p>Start implementing GenAI APIs in your Android apps today with guidance from our official <a href="https://developers.google.com/ml-kit/genai" target="_blank">documentation</a> and samples on GitHub: <a href="https://github.com/android/ai-samples/tree/main/ai-catalog/samples" target="_blank">AI Catalog GenAI API Samples with Compose</a> and <a href="https://github.com/googlesamples/mlkit/tree/master/android/genai" target="_blank">ML Kit GenAI APIs Quickstart</a>.</p>
New in-car app experiences<meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5ch4mDTckRM3NbELIn-7qkAHBZvZePkcVQTaZR1wvjp343urRIkqoKWO2qdULTfdjtX0Tcc20jiyMbP8DiMuEhuuewIEBxSTITgGYo1ZlAI0Ro5AYcJPxQcUTPLxZvKC0gzM7zVTFXEFxA3VIu4cNPhcGjvp9RqIeCccPaLclpfA5_LcSLbBsP_BdG8s/s1600/android-auto-google-io-meta.png" name="twitter:image"></meta>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj5ch4mDTckRM3NbELIn-7qkAHBZvZePkcVQTaZR1wvjp343urRIkqoKWO2qdULTfdjtX0Tcc20jiyMbP8DiMuEhuuewIEBxSTITgGYo1ZlAI0Ro5AYcJPxQcUTPLxZvKC0gzM7zVTFXEFxA3VIu4cNPhcGjvp9RqIeCccPaLclpfA5_LcSLbBsP_BdG8s/s1600/android-auto-google-io-meta.png" style="display: none;" />
<em>Posted by Noam Gefen – Product Manager, Android for Cars, Sole Alborno – Product Manager, Gemini, and Ben Sagmoe – Developer Relations Engineer, Android for Cars</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjwMRJDvCDNiqUTb45-PlkPAqKeZUDv4Ew0lFbkWMLy7fPELDQ3nLYAqx6RR2lbFkuTcE-hkdi_pdGQB9_BXTISrkK5xmgj3EQ6UKlCvl0vGHrZj5PxqVK01BuilgyX08Jje5Wl675eu1YfAqNu2jpZg5BmRO65yr_c1FeiecdHSFk_TGIWU_k-LCn89PA/s1600/IO25-SVD-Blog-Banner-01-4209x1253.png"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjwMRJDvCDNiqUTb45-PlkPAqKeZUDv4Ew0lFbkWMLy7fPELDQ3nLYAqx6RR2lbFkuTcE-hkdi_pdGQB9_BXTISrkK5xmgj3EQ6UKlCvl0vGHrZj5PxqVK01BuilgyX08Jje5Wl675eu1YfAqNu2jpZg5BmRO65yr_c1FeiecdHSFk_TGIWU_k-LCn89PA/s1600/IO25-SVD-Blog-Banner-01-4209x1253.png" /></a>
<p>The in-car experience continues to evolve rapidly, and Google remains committed to pushing the boundaries of what's possible. At Google I/O 2025, we're excited to unveil the latest advancements for drivers, car manufacturers, and developers, furthering our goal of a safe, seamless, and helpful connected driving experience.</p>
<p>Today's car cabins are increasingly digital, offering developers exciting new opportunities with larger displays and more powerful computing. <a href="https://www.android.com/auto/" target="_blank">Android Auto</a> is now supported in nearly all new cars sold, with almost 250 million compatible vehicles on the road.</p>
<p>We're also seeing significant growth in cars powered by Android Automotive OS with <a href="https://built-in.google/cars/" target="_blank">Google built-in</a>. Over 50 models are currently available, with more launching this year. This growth is fueled by a thriving app ecosystem, including over 300 apps already available on the Play Store. These include apps optimized for a safe and seamless experience while driving, as well as entertainment apps for when you're parked and waiting in your car, many of which are adaptive mobile apps brought seamlessly to cars through the <a href="https://developer.android.com/training/cars/car-ready-mobile-apps" target="_blank">Car Ready Mobile Apps Program</a>.</p>
<p>A vibrant developer community is essential to delivering these innovative in-car experiences utilizing the different screens within the car cabin. This past year, we've focused on key areas to help empower developers to build more differentiated experiences in cars across both platforms, as we embark on the Gemini era in cars.</p>
<iframe class="BLOG_video_class" allowfullscreen="" youtube-src-id="ud09zuXHst4" width="100%" height="413" src="https://www.youtube.com/embed/ud09zuXHst4"></iframe>
<h2><span style="font-size: x-large;">Gemini for Cars</span></h2>
<p>Exciting news for in-car experiences: Gemini, Google's advanced AI, is coming to vehicles! This unlocks a new era of safe and helpful interactions on the go.</p>
<p>Gemini enables natural voice conversations and seamless multitasking, empowering drivers to get more done simply by speaking naturally. Imagine effortlessly finding charging stations or navigating to a location pulled directly from an email, all with just your voice.</p>
<p>You can learn how to <a href="https://blog.google/products/android/gemini-for-cars/" target="_blank">leverage Gemini's potential to create engaging in-car experiences in your app</a>.</p>
<p>Navigation apps can integrate with Gemini using <a href="https://developer.android.com/training/cars/apps/navigation#support-navigation-intents" target="_blank">three core intent formats</a>, allowing you to start navigation, display relevant search results, and execute custom actions, such as enabling users to report incidents like traffic congestion using their voice.</p>
<p>Gemini for cars will be rolling out in the coming months. Get ready to build the next generation of in-car AI experiences!</p>
<h2><span style="font-size: x-large;">New developer programs and tools</span></h2>
<image><div style="text-align: center;"><img alt="table of app categories showing availability in android Auto and cars with Google built-in, including media, navigation, point-of-interest, internet of things, weather, video, browsers, games, and communication such as messaging and voip" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgjEV15_YeonXC72heQn6EKA7qCmBIWoiZ6zG04KhWUyq9NWCeY6NnG_hdeTPqPy6XxGR7QszUUxC-EnmhZOTE1MaGTi5g76yxUso2S4qCkm4a80ZgNh9z0AntrSPB_vXDYkEq4JyA84G3XNCPy0dDFc8ASboQB2eO60tpYEqwdhyphenhyphenIfMwmqfH8_Ccuxcjw/s16000/app-categories-availability-android-auto-google-io.png" /></div></image><br />
<p>Last year, we introduced car app quality tiers to inspire developers to create high quality in-car experiences. By developing your app in compliance with the Car ready tier, you can bring video, gaming, or browser apps to run while parked in cars with Google built-in with almost no additional effort. <a href="https://developer.android.com/training/cars/car-ready-mobile-apps" target="_blank">Learn more about Car Ready Mobile Apps</a>.
</p><p>Your app can further shine in cars within the Car optimized and Car differentiated tiers to unlock experiences while the car is in motion, and also when transitioning between parked and driving modes, while utilizing the different screens within the modern car cabin. Check the <a href="https://developer.android.com/docs/quality-guidelines/car-app-quality?_gl=1*la8ojm*_up*MQ..*_ga*OTA1MjQ0MjkuMTc0Njc5NDM3NQ..*_ga_6HH9YJMN9M*czE3NDY3OTQzNzUkbzEkZzAkdDE3NDY3OTQzNzUkajAkbDAkaDEwODQ1MjQ2NTE.#car-quality-tiers" target="_blank">car app quality guidelines</a> for details.
</p><p>To start with, across both Android Auto and for cars with Google built-in, we've made some exciting improvements for <a href="https://developer.android.com/training/cars/apps" target="_blank">Car App Library</a>:</p>
<ul>
<li><b>The <a href="https://developer.android.com/training/cars/apps/weather" target="_blank">Weather</a> app category</b> has graduated from beta: any developer can now publish weather apps to production tracks on both Android Auto and cars with Google built-in. Before you publish your app, check that it meets the <a href="https://developer.android.com/docs/quality-guidelines/car-app-quality?category=weather#app_categories" target="_blank">quality guidelines for weather apps</a>.</li>
<li><b>Designing templated apps</b> is easier than ever with the <a href="https://goo.gle/figma-car-app-design-kit" target="_blank">Car App Templates Design Kit</a> we just published on Figma.</li>
<li>Two new templates, the <span style="color: #0d904f; font-family: courier;">SectionedItemTemplate</span> and <span style="color: #0d904f; font-family: courier;">MediaPlaybackTemplate</span>, are now available in the Car App Library 1.8 alpha release for use on Android Auto. These templates are a great fit for building <a href="https://developer.android.com/training/cars/apps/media" target="_blank">templated media apps</a>, allowing for increased customization in layout and browsing structure.</li>
</ul>
<image><div style="text-align: center;"><img alt="example of sectioneditemtemplate on the left and mediaplaybacktemplate on the right" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgCyBStVeMjLLWDqOGeggueFCuk5MhIMMV5Jxnu7jaWh4bNT6u1jdd6oqG0o3dUNgbkPT3F4WJh6iDqJmNvrUpTXMy0806nnykCU-eQbY1wZ3PNdFYnUDOQkaMWqqHdZlj7ltBh_Z17iiprovpMRDnLnCRllAJ1VR6KpBeyENdEDkgtryw5_iJ7Of7qKZQ/s16000/sectioneditemtemplate-mediaitemtemplate-android-auto-google-io.png" /></div></image><br />
<p>On Android Auto, many new app categories and capabilities are now in beta:</p>
<ul>
<li>We are adding support for <b>building media apps</b> with the Car App Library, enabling media app developers to build the richer, more complete experiences that users are used to on their phones. During beta, developers can build and publish media apps built using the Car App Library to internal testing and closed testing tracks. You can also <a href="https://goo.gle/Media-Comms-EAP" target="_blank">express interest in being an early access partner</a> to publish to production while the category is in beta. See <a href="https://developer.android.com/training/cars/apps/media" target="_blank">Build a templated media app</a> to learn more and get started.</li>
<li><b>The <a href="http://developer.android.com/training/cars/communications" target="_blank">communications</a> category</b> is in beta. We've simplified integration for calling apps by utilizing the <a href="https://developer.android.com/reference/androidx/core/telecom/CallsManager" target="_blank"><span style="font-family: courier;">CallsManager</span> Jetpack API</a>. Together with the templates provided by the Car App Library, this enables communications apps to build features like full message history, upcoming meetings list, rich in-call views, and more. During beta, developers can build and publish communications apps to internal testing and closed testing tracks. You can also <a href="http://goo.gle/Media-Comms-EAP" target="_blank">express interest in being an early access partner</a> to publish to production while the category is in beta.</li>
<li><b>Games</b> are now supported in Android Auto, while parked, on phones running Android 15 and above. You can already find some popular titles like Angry Birds 2, Farm Heroes Saga, Candy Crush Soda Saga, and Beach Buggy Racing 2. To add support for Android Auto to your own app, see <a href="https://developer.android.com/training/cars/parked/games" target="_blank">Build games for cars</a> and <a href="https://developer.android.com/training/cars/parked/auto" target="_blank">Add support for Android Auto to your parked app</a>. The Games category is in beta, and developers can publish games to internal testing and closed testing tracks. You can also <a href="http://goo.gle/Games-EAP" target="_blank">express interest in being an early access partner</a> to publish to production while the category is in beta.</li>
</ul>
<p>Finally, we have further simplified building, testing and distribution experience for developers building apps for Android Automotive OS cars with Google built-in:</p>
<ul>
<li><b>The Games category is now in beta for cars with Google built-in</b>, allowing developers to distribute their adaptive games to cars. You can <a href="http://goo.gle/Games-EAP" target="_blank">express interest in publishing to the production track</a>. <b><a href="https://developer.android.com/training/cars/platforms/automotive-os/google-play/google-services#pgs" target="_blank">Google Play Games Services</a></b> (v2) are now available on cars with Google built-in, enabling seamless login flows, cross-device save states, and more. <a href="https://developer.android.com/games/pgs/start" target="_blank">Get started with Google Play Games Services</a> to learn more.</li>
<li><b>Distribution through Google Play</b> is more flexible than ever. It’s now possible for apps in the <a href="https://developer.android.com/training/cars/parked" target="_blank">parked categories</a> to distribute the same APK or App Bundle to cars with Google built-in as to phones, including through the mobile release track. Learn more about how to <a href="https://developer.android.com/training/cars/distribute#choose-track-aaos" target="_blank">distribute to cars</a>.</li>
<li><b><a href="https://developer.android.com/training/cars/testing/aaos-on-pixel" target="_blank">Android Automotive OS on Pixel Tablet</a></b> is now generally available, giving you a physical device option for testing Android Automotive OS apps without buying or renting a car. Additionally, the most recent system images include support for acting as an Android Auto receiver, meaning you can use the same device to test both your app’s experience on Android Auto and on Android Automotive OS. <a href="http://goo.gle/Tablet-AAOS" target="_blank">Apply for access to these images</a>.</li>
</ul>
<h2><span style="font-size: x-large;">The road ahead</span></h2>
<p>You can look forward to more updates later this year, including:</p>
<ul>
<li><b>Video apps</b> will be supported on Android Auto, starting with phones running Android 16 on select compatible cars. If your app is already adaptive, enabling your app experience while parked requires only minimal steps to distribute to cars.</li>
<li>For Android Automotive OS cars running Android 14+ with Google built-in, we are working with car manufacturers to add additional app compatibility, enabling thousands of adaptive mobile apps in the next phase of the <b><a href="https://developer.android.com/training/cars/car-ready-mobile-apps" target="_blank">Car Ready Mobile Apps Program</a></b>.</li>
<li><b>Updated design documentation</b> that visualizes car app quality guidelines and integration paths to simplify designing your app for cars.</li>
<li>Google Play services for cars with Google built-in are expanding to bring them on par with mobile, including:
<ul>
<li><b>Passkeys and Credential Manager</b> APIs for a more seamless user sign-in experience.</li>
<li><b>Quick Share</b>, which will enable easy cross-device sharing from phone to car.</li>
</ul></li>
<li><b>Audio while driving for video apps</b>: For cars with Google built-in, we're working with OEMs to enable audio-only listening for video apps while driving. <a href="http://goo.gle/440dHqw" target="_blank">Sign up to express interest in participating in the early access program</a>. If you’d like to prepare for this feature’s general availability, you can work through the <a href="https://developer.android.com/codelabs/audio-while-driving" target="_blank">audio while driving codelab</a> or check out the <a href="https://developer.android.com/training/cars/parked/video" target="_blank">Build video apps for Android Automotive OS</a> page.</li>
<li><b>Firebase Test Lab</b> is adding Android Automotive OS devices to its device catalog, making it possible to test on real car hardware without needing to buy or rent a car. <a href="http://goo.gle/Firebase-for-cars" target="_blank">Sign up to express interest in becoming an early access partner</a>.</li>
<li><b><a href="https://play.google.com/console/about/pre-launchreports/" target="_blank">Pre-launch reports</a> for Android Automotive OS</b> are coming soon to the Play Console, helping you ensure app quality before distributing your app to cars.</li>
</ul>
<p>Be sure to keep up to date on these features and more through goo.gle/cars-whats-new as we continuously invest in the future of Android in the car. Stay tuned for more resources to help you build innovative and engaging experiences for drivers and passengers.</p>
<p><b>Ready to publish your car app?</b> Check our <a href="https://developer.android.com/training/cars/distribute?_gl=1*7rj7q0*_up*MQ..*_ga*OTA1MjQ0MjkuMTc0Njc5NDM3NQ..*_ga_6HH9YJMN9M*czE3NDY3OTQzNzUkbzEkZzAkdDE3NDY3OTUyMTUkajAkbDAkaDEwODQ1MjQ2NTE." target="_blank">guidance for distributing to cars</a>.</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p>I/O 2025: What's new in Google Play<meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjgbUsWApjG8FwIqgCf8yKKJPi5tzzhLbWPZ0J1TmGzI3cNdC3HRzD4wy0WI2a-QGNdsQgfelVe4_s5TCnY7xsgWtySClo3rvGgT0OAuSu8kOT-1ROX1DOHfJl_1npzLTXUnTpM3alzhUOkYmHWL8c-KRqVEZ498rsNh6ziidsQtrgBnKqXdz8HKs4CbuE/s1600/google-io-whats-new-in-play-meta.png" name="twitter:image"></meta>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjgbUsWApjG8FwIqgCf8yKKJPi5tzzhLbWPZ0J1TmGzI3cNdC3HRzD4wy0WI2a-QGNdsQgfelVe4_s5TCnY7xsgWtySClo3rvGgT0OAuSu8kOT-1ROX1DOHfJl_1npzLTXUnTpM3alzhUOkYmHWL8c-KRqVEZ498rsNh6ziidsQtrgBnKqXdz8HKs4CbuE/s1600/google-io-whats-new-in-play-meta.png" style="display: none;" />
<em>Posted by Paul Feng, VP of Product Management, Google Play</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhwq-OmkcDbU1aI2R5K-5G_buGNTeuXUiJAfBFZUOLRhdxzLz2LdQWPE76VoouXbOP-qVAj2lDeD15bxlJ0dpzBIwc_Sti2gyBVCkXE6ZQ4mtr6qqhrPE10zgf3MDWFiK4wIpuf1wPb8ZOGUtfktLA2e3ySfZi8deGcwezsdKkgimM9jYY_WOu9y8Y7Q20/s1600/google-io-whats-new-in-play.png"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhwq-OmkcDbU1aI2R5K-5G_buGNTeuXUiJAfBFZUOLRhdxzLz2LdQWPE76VoouXbOP-qVAj2lDeD15bxlJ0dpzBIwc_Sti2gyBVCkXE6ZQ4mtr6qqhrPE10zgf3MDWFiK4wIpuf1wPb8ZOGUtfktLA2e3ySfZi8deGcwezsdKkgimM9jYY_WOu9y8Y7Q20/s1600/google-io-whats-new-in-play.png" /></a>
<p>At Google Play, we're dedicated to helping people discover experiences they'll love, while empowering developers like you to bring your ideas to life and build successful businesses.</p>
<p>At this year’s Google I/O, we unveiled the latest ways we’re empowering your success with new tools that provide robust testing and actionable insights. We also showcased how we’re continuing to build a content-rich Play Store that fosters repeat engagement alongside new subscription capabilities that streamline checkout and reduce churn.</p>
<p>Check out all the exciting developments from I/O that will help you grow your business on Google Play. Continue reading or watch the session to dive in.</p>
<iframe class="BLOG_video_class" allowfullscreen="" youtube-src-id="POUqfDBtRbg" width="100%" height="413" src="https://www.youtube.com/embed/POUqfDBtRbg"></iframe>
<h2><span style="font-size: x-large;">Helping you succeed every step of the way</span></h2>
<p>Last month, we introduced our <a href="https://android-developers.googleblog.com/2025/04/play-console-insights.html" target="_blank">latest Play Console updates</a> focused on improving quality and performance. A redesigned app dashboard centered around four developer objectives (Test and release, Monitor and improve, Grow users, Monetize) and new Android vitals metrics offer quick insights and actionable suggestions to proactively improve the user experience.</p>
<h3><span style="font-size: large;">Get more actionable insights with new Play Console overview pages</span></h3>
<p>Building on these updates, we've launched dedicated overview pages for two developer objectives: Test and release, and Monitor and improve. These new pages bring together more objective-related metrics, relevant features, and a "Take action" section with contextual, dynamic advice. Overview pages for Grow and Monetize will be coming soon.</p>
<h3><span style="font-size: large;">Halt fully-rolled out releases when needed</span></h3>
<p>Historically, a release at 100% rollout meant there was no turning back, leaving users stuck with a flawed version until a new update rolled out. Soon, you'll be able to halt fully-live releases through Play Console and the Publishing API, stopping the distribution of problematic versions to new users.</p>
<image><div style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjanjLfiK0VD0n8uZB8ZIid3_AIcnt9Xf0eCn-v9D3HaBBjClnaBteJLf68q56CRsoxilZKes0kFDjuzmCpF_xqsYwyZ1r8SMPhjcaGNgc7Tx8UAtrxKFFLKovZEhL4xREUO-hDqUAci-uw9lFrEfvWYeECzkkVHExD1aHg8pe5KXjruTez3i3FdKF6Wkk/s1600/PlayBlogAssets_Halt_rollout_1600x928.gif"><img alt="a moving screen grab of release manager in Play Console" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjanjLfiK0VD0n8uZB8ZIid3_AIcnt9Xf0eCn-v9D3HaBBjClnaBteJLf68q56CRsoxilZKes0kFDjuzmCpF_xqsYwyZ1r8SMPhjcaGNgc7Tx8UAtrxKFFLKovZEhL4xREUO-hDqUAci-uw9lFrEfvWYeECzkkVHExD1aHg8pe5KXjruTez3i3FdKF6Wkk/s1600/PlayBlogAssets_Halt_rollout_1600x928.gif" /></a></div><imgcaption><center><em>You'll soon be able to halt fully live releases directly from Play Console and the Publishing API, stopping the distribution of problematic versions to new users.</em></center></imgcaption></image><br />
<h3><span style="font-size: large;">Optimize your store listings with better management tools and metrics</span></h3>
<p>We launched two tools to enhance your store listings. The asset library makes it easy to upload, edit, and view your visual assets. Upload them from Google Drive, organize with tags, and crop for repurposing. And with new open metrics, you <a href="https://support.google.com/googleplay/android-developer/answer/9859173?hl=en&sjid=14407705712869133246-EU#zippy=%2Cmetrics" target="_blank">gain deeper insights</a> into listing performance so you can better understand how they attract, engage, and re-engage users.</p>
<div id="unlocking-discovery-engagement"><a href="#unlocking-discovery-engagement"></a></div>
<h3><span style="font-size: large;">Stay ahead of threats with the Play Integrity API</span></h3>
<p>We're committed to robust security and preventing abuse so you can thrive on Play’s trusted platform. The <a href="https://developer.android.com/google/play/integrity" target="_blank">Play Integrity API</a> continuously evolves to combat emerging threats, with these recent enhancements:</p>
<ul><ul>
<li><b>Stronger abuse detection for all developers</b> that leverages the latest Android hardware-security with no developer effort required.</li>
<li><b>Device security update checks</b> to safeguard your app’s sensitive actions like transfers or data access.</li>
<li><b>Public beta for device recall</b> which enables you to detect if a device is being reused for abuse or repeated actions, even after a device reset. You can <a href="https://goo.gle/play-device-recall" target="_blank">express interest in this beta</a>.</li>
</ul></ul>
<h2><span style="font-size: x-large;">Unlocking more discovery and engagement for your apps and its content</span></h2>
<p>Last year, we <a href="https://android-developers.googleblog.com/2024/05/io-24-whats-new-in-google-play.html" target="_blank">shared our vision</a> for a content-rich Google Play that has already delivered strong results. Year-over-year, Apps Home has seen over a 25% increase in average monthly visitors with apps seeing a 10% growth in acquisitions and double-digit growth in app spend for those monetizing on Google Play. Building on that vision, we're introducing even more updates to elevate your discovery and engagement, both on and off the store.</p>
<p>For example, <b>curated spaces</b>, launched last year, celebrate seasonal interests like football (soccer) in Brazil and cricket in India, and evergreen interests like comics in Japan. By adding daily content—match highlights, promotions, and editorial articles directly on the Apps Home—these spaces foster discovery and engagement. Curated spaces are a hit, with over 920,000 highly engaged users in Japan returning to the comics space monthly. Building on this momentum, we are expanding to more locations and categories this year.</p>
<image><div style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjVUyGVGKxjhwCdG89ASNI2eI2naUmMZfe2XuGYurkXybbxrPIhRnLVaMN73kNfHsiKxeLQYvDjvBQlQADAESZ47cWxoflhlvVyio2YNI5SnNLj0-VeSSHOkI2IW4Mi6bE4LgBJnHFrMhxEDQuTHdZLpulIvTPxVFHcEKPD-aA3_CS45qeOBqGtCeNV4zU/s16000/curated-spaces-new-in-play-google-io.gif"><img alt="a moving image of three mobile devices displaying curated spaces on the Play Store" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjVUyGVGKxjhwCdG89ASNI2eI2naUmMZfe2XuGYurkXybbxrPIhRnLVaMN73kNfHsiKxeLQYvDjvBQlQADAESZ47cWxoflhlvVyio2YNI5SnNLj0-VeSSHOkI2IW4Mi6bE4LgBJnHFrMhxEDQuTHdZLpulIvTPxVFHcEKPD-aA3_CS45qeOBqGtCeNV4zU/s16000/curated-spaces-new-in-play-google-io.gif" /></a></div><imgcaption><center><em>Our curated spaces add daily content to foster repeat discovery and engagement.</em></center></imgcaption></image><br />
<p>We're launching <b>new topic browse pages</b> that feature timely, relevant, and visually engaging content. Users can find them throughout the Store, including Apps Home, store listing pages, and search. These pages debut this month in the US with Media & Entertainment, showcasing over 100,000 shows, movies, and select sports. More localized topic pages will roll out globally later this year.</p>
<image><div style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjyv6pZ9ZiAmQL9NkPKGZwm_tgWsCejXqgMqit7InZX9SYPbMppYCPBA3FeG6ez-mdAVS4RZXJ8aQG8olw_PDi7_KVi75RM9CD4SIrcQRLRstLJkokKURNoZGXz9kOLWaq98deFZhH0vVyAF9xiqsZEZXZXH5ryrg9786xZ-vRzMcXYUTV3TtIZeIaDSL8/w640-h590/new-browse-pages-media-entertainment-new-in-play-google-io.gif"><img alt="a moving image of two mobile devices displaying new browse pages for media and entertainment in the Play Store" border="0" height="590" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjyv6pZ9ZiAmQL9NkPKGZwm_tgWsCejXqgMqit7InZX9SYPbMppYCPBA3FeG6ez-mdAVS4RZXJ8aQG8olw_PDi7_KVi75RM9CD4SIrcQRLRstLJkokKURNoZGXz9kOLWaq98deFZhH0vVyAF9xiqsZEZXZXH5ryrg9786xZ-vRzMcXYUTV3TtIZeIaDSL8/w640-h590/new-browse-pages-media-entertainment-new-in-play-google-io.gif" width="65%" /></a></div><imgcaption><center><em>New topic browse pages for media and entertainment are rolling out this month in the US.</em></center></imgcaption></image><br />
<p>We’re expanding <b>Where to Watch</b> to more markets, including the UK, Korea, Indonesia, and Mexico, to help users find and deep-link directly into their subscribed apps for movies and TV. Since launching in the US in November 2024, we've seen promising results: People who view app content through Where to Watch return to Play more frequently and increase their content search activity by 30%.</p>
<p>We're also enhancing how your content is displayed on the Play Store. Starting this July, all app developers can add a <b>hero content carousel</b> and a <b>YouTube playlist carousel</b> to their store listings. These formats will help showcase your best content and drive greater user engagement and discovery.</p>
<p>For apps best experienced through sound, we're launching <b>audio samples</b> on the Apps Home. A simple tap offers users a brief escape into your audio content. In early testing, audio samples made users 3x more likely to install or open an app! This feature is now available for all Health & Wellness app developers with users in the US, with more categories and markets coming soon. You can <a href="https://goo.gle/play-audio-samples" target="_blank">express your interest in promoting audio content</a>.</p>
<image><div style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiNeLmRYs6g8B9gGtuXJ9WJjOXWo62idQ9V-F8bqqm1k_29Yiwamy1faBLDd0SJRC0B6HYCiY11iWJjfYZoO7qn4UJY4zU4ckStiG3iSvAXqV76rHpVJj0mkTYs73fyjrOj3SMIkVOU77NLwWx2D-VFO-E9_0qPen1U2owcKq_8jt1Zth5NbX0nnBL4StU/s16000/curated-spaces-new-in-play-google-io.gif"><img alt="a moving image of three mobile devices displaying how content is displayed on the Play Store" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiNeLmRYs6g8B9gGtuXJ9WJjOXWo62idQ9V-F8bqqm1k_29Yiwamy1faBLDd0SJRC0B6HYCiY11iWJjfYZoO7qn4UJY4zU4ckStiG3iSvAXqV76rHpVJj0mkTYs73fyjrOj3SMIkVOU77NLwWx2D-VFO-E9_0qPen1U2owcKq_8jt1Zth5NbX0nnBL4StU/s16000/curated-spaces-new-in-play-google-io.gif" /></a></div><imgcaption><center><em>We're enhancing how your content is displayed on the Play Store, </em></center><center><em>offering new ways to showcase your app and drive user engagement.</em></center></imgcaption></image><br />
<h2><span style="font-size: x-large;">Helping you take advantage of deeper engagement on Play, on and off the Store</span></h2>
<p>Last year, we introduced <b><a href="https://goo.gle/play-engagesdk" target="_blank">Engage SDK</a></b>, a unified solution to deliver personalized content and guide users to relevant in-app experiences. Integrating it unlocks surfaces like Collections, our immersive full-screen experience bringing content directly to the user's home screen.</p>
<p>We're rolling out updates to expand your content’s reach even further:</p>
<ul><ul>
<li><b>Engage SDK content is coming to the Play Store this summer</b>, in addition to existing spaces like Collections and Entertainment Space on select Android tablets.</li>
<li><b>New content categories</b> are now supported, starting today with Travel.</li>
<li><b>Collections are rolling out globally</b> to Google Play markets starting today, including Brazil, India, Indonesia, Japan, and Mexico.</li>
</ul></ul>
<p>Integrate with Engage SDK today to take advantage of this new expansion and boost re-engagement. <a href="https://developer.android.com/codelabs/engage-sdk-codelab" target="_blank">Try our codelab</a> to test the ease of publishing content with Engage SDK and <a href="https://support.google.com/googleplay/contact/Engage_SDK_Developer_Preview" target="_blank">express interest in the developer preview</a>.</p>
<image><div style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj14hdv5CphRLJX15uRNT8SQjHZRoHji5s4_sFw04O1PYU2hMwz2vkiDFg34SPDOw2ybQTuz50AWG9BGKdFu5-ZOSGQXLE_gZTYdRmYJ2AwzOvJ7YavND3e7O3s7EZkK44ZDedeZsbKWV9qU41kCOEGseHL4crjuHgANYLjJd8C_Si6N6TR43k4GDyjmWk/s16000/collections-new-in-play.png"><img alt="a mobile device displaying Collections on the Play Store" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj14hdv5CphRLJX15uRNT8SQjHZRoHji5s4_sFw04O1PYU2hMwz2vkiDFg34SPDOw2ybQTuz50AWG9BGKdFu5-ZOSGQXLE_gZTYdRmYJ2AwzOvJ7YavND3e7O3s7EZkK44ZDedeZsbKWV9qU41kCOEGseHL4crjuHgANYLjJd8C_Si6N6TR43k4GDyjmWk/s16000/collections-new-in-play.png" width="30%" /></a></div><imgcaption><center><em>Engage SDK now supports Collections for Travel. </em></center><center><em>Users can find timely itineraries and recent searches, all in one convenient place.</em></center></imgcaption></image><br />
<h2><span style="font-size: x-large;">Maximizing your revenue with subscriptions enhancements</span></h2>
<p>With over a quarter-billion subscriptions, Google Play is one of the world's largest subscription platforms. We're committed to helping you turn engaged users into revenue growth by continually enhancing our tools to meet evolving customer needs.</p>
<p>To streamline your purchase flow, we’re introducing <b>multi-product checkout for subscriptions</b>. This lets you sell subscription add-ons alongside base subscriptions, all under a single, aligned payment schedule. Users get a simplified experience with one price and one transaction, while you gain more control over how subscribers upgrade, downgrade, or manage their add-ons.</p>
<image><div style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgRfzFpLHDjXsQR7XXv0d1a8QIs8B_ju-yh_5YTGtKat8rysyE3arNBRTn6eLaTQMOF2ZhyphenhyphentnsKM-Q5uJxyjBuUf61pDyk-OhLjhlaFe2K_K81PPwhqzZLecXAZe1EkkCUCZearLxQt_4YCLoL6mqs4hfRtC3dnv5eOY8vX4UYbtTfeokKt9Y1Lh1HFmZM/s16000/subscription-enhancement-new-in-play-google-io.png"><img alt="a mobile devices displaying multi-checkout where a base subscription plus add ons in shown a singluar transaction on the Play Store" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgRfzFpLHDjXsQR7XXv0d1a8QIs8B_ju-yh_5YTGtKat8rysyE3arNBRTn6eLaTQMOF2ZhyphenhyphentnsKM-Q5uJxyjBuUf61pDyk-OhLjhlaFe2K_K81PPwhqzZLecXAZe1EkkCUCZearLxQt_4YCLoL6mqs4hfRtC3dnv5eOY8vX4UYbtTfeokKt9Y1Lh1HFmZM/s16000/subscription-enhancement-new-in-play-google-io.png" width="45%" /></a></div><imgcaption><center><em>You can now sell base subscriptions and add-ons together </em></center><center><em>in a single, streamlined transaction.</em></center></imgcaption></image><br />
<p>To help you retain more of your subscribers, we’re now <b>showcasing subscription benefits in more places across Play</b> – including the Subscriptions Center, in reminder emails, and during purchase and cancellation flows. This increased visibility has already reduced voluntary churn by 2%. Be sure to enter your subscription benefits in Play Console so you can leverage this powerful new capability.</p>
<image><div style="text-align: center;"><a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEihIu0N3bEzQ5UJzeTa3ze-ClkE4f3z6S5vfWVF26wQsihl7mCtP1X8ubD6j2AxyuMrfgwNearReuDvBw99yzAEleG8ZurG9bqk8vS3-0qmF2ighdxGbIzOzNEYSxTttF8YYF7_aeiY_N5blfLLR0hC7d22MZ3AEuVaXdcfXDQXCSF18WhOweWrYncGglo/s1600/IO25_PAKEY_LS_067-What%27s-new-in-Google-Play-reminders2.png"><img alt="five mobile devices showing subscriptions in Play" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEihIu0N3bEzQ5UJzeTa3ze-ClkE4f3z6S5vfWVF26wQsihl7mCtP1X8ubD6j2AxyuMrfgwNearReuDvBw99yzAEleG8ZurG9bqk8vS3-0qmF2ighdxGbIzOzNEYSxTttF8YYF7_aeiY_N5blfLLR0hC7d22MZ3AEuVaXdcfXDQXCSF18WhOweWrYncGglo/s16000/IO25_PAKEY_LS_067-What's-new-in-Google-Play-reminders2.png" /></a></div><imgcaption><center><em>To help reduce voluntary churn, we’re showcasing your subscriptions benefits across Play.
</em></center></imgcaption></image><br />
<p>Reducing involuntary churn is a key factor in optimizing your revenue. When payment methods unexpectedly decline, users might unintentionally cancel. Now, instead of immediate cancellation, you can choose a <b>grace period</b> (up to 30 days) or an <b>account hold</b> (up to 60 days). Developers who increased the decline recovery period from 30 to 60 days saw an average 10% reduction in involuntary churn for renewals.</p>
<p>On top of this, we're expanding <b>our commitment to get more buyers ready for purchases</b> throughout their entire journey. This includes prompting users to set up payment methods and verification right at device setup. After setup, we've integrated prompts into highly visible areas like the Play and Google account menus. And as always, we’re continuously enabling payments in more markets and expanding payment options. Plus, our AI models now help optimize in-app transactions by suggesting the right payment method at the right time, and we're bringing buyers back with effective cart abandonment reminders.</p>
<h2><span style="font-size: x-large;">Grow your business on Google Play</span></h2>
<p>Our latest updates reinforce our commitment to fostering a thriving Google Play ecosystem. From enhanced discovery and robust tools to new monetization avenues, we're empowering you to innovate and grow. We're excited for the future we're building together and encourage you to use these new capabilities to create even more impactful experiences. Thank you for being an essential part of the Google Play community.</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p><br /><p></p><p></p><p></p><p></p>Android Developershttp://www.blogger.com/profile/08588467489110681140[email protected]0tag:blogger.com,1999:blog-6755709643044947179.post-41382773486629451082025-05-20T10:54:00.000-07:002025-05-20T12:01:02.697-07:00In-App Ratings and Reviews for TV<meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgI8u5ef64SB3dtNYxWNaem2LQHIQ6T22U2ZMN1i_GSSxJEYSkjzI4YU4dhRKq6Vd35DBr05KBbUNksNEYKQsoLu2ue1vPqtPm4uyTVV2wYJhnm3gMZki1AVtUniuyi7G1KpkBDkTfVpn3ZvgJi3oavdrDdOHV9NN2iLCWRoBt2DkDEWLrCcBqDZa7Cv64/s1600/In-App%20Ratings%20and%20Reviews%20for%20TV.png" name="twitter:image"></meta>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgI8u5ef64SB3dtNYxWNaem2LQHIQ6T22U2ZMN1i_GSSxJEYSkjzI4YU4dhRKq6Vd35DBr05KBbUNksNEYKQsoLu2ue1vPqtPm4uyTVV2wYJhnm3gMZki1AVtUniuyi7G1KpkBDkTfVpn3ZvgJi3oavdrDdOHV9NN2iLCWRoBt2DkDEWLrCcBqDZa7Cv64/s1600/In-App%20Ratings%20and%20Reviews%20for%20TV.png" style="display: none;" />
<em>Posted by Paul Lammertsma – Developer Relations Engineer</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh3Mu2hOT6Ypnj7GLjRXlD0XgJZuVsNNY2KoSOchckvixh8dx70BhizYxxtH2hrjFwUHJhobtjhTz_cBCZJnk-56pn6NbAiI-dOhC6lqgTf5Ug40AKpOi41sOyIJeqoX2H3At6NHi4BPQx9jRvo8nzv_-DgWQopHlkJZFncNMrzozio0cJq0GpUsxVgxdQ/s1600/O25-BHero-Android-8.png"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh3Mu2hOT6Ypnj7GLjRXlD0XgJZuVsNNY2KoSOchckvixh8dx70BhizYxxtH2hrjFwUHJhobtjhTz_cBCZJnk-56pn6NbAiI-dOhC6lqgTf5Ug40AKpOi41sOyIJeqoX2H3At6NHi4BPQx9jRvo8nzv_-DgWQopHlkJZFncNMrzozio0cJq0GpUsxVgxdQ/s1600/O25-BHero-Android-8.png" /></a>
<p>Ratings and reviews are essential for developers, offering quantitative and qualitative feedback on user experiences. In 2022, we enhanced the granularity of this feedback by <a href="https://android-developers.googleblog.com/2021/08/making-ratings-and-reviews-better-for.html" target="_blank">segmenting these insights by countries and form factors</a>.</p>
<p>Now, we're extending the In-App Ratings and Reviews API to TV, allowing developers to prompt users for ratings and reviews directly on Google TV.</p>
<h2><span style="font-size: x-large;">Ratings and reviews on Google TV</span></h2>
<image><div style="text-align: center;"><img alt="Ratings and reviews entry point forJetStream sample app on TV" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiSiCzDyh4d5xTdHAd7qvboZAdTu27jR7vyKNXHDD7RUZRzLOZ6xEDj9t1s_VbMguC8sAS4RXSI188wYxfPhYjGUnY75EfbGdadUIUBDK_s0R7F6VTm7XVyYkrYsXEskVCTHq5S6WB4Cu0T70fctp_bZCiy5-VMDe6UFt4XP5X4PdXnZU5r_OWbJ_KdQfw/s16000/JetStream%20App%20Rating%20Menu%20(1).png" /></div></image><br />
<p>Users can now see rating averages, browse reviews, and leave their own review directly from an app's store listing on Google TV.</p>
<image><div style="text-align: center;"><img alt="Ratings and written reviews input screen on TV" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiiiuVwelcN726bFlMZfqEYdObhLFyVVnHbIfB-snHyLOqMtwUUOKq0yjy1j6ANxH8eTcpJtz4G23NENdrJKyl6cMkBKTccf9dxvXIafUgfVHhe_DOFuTM8Cd7yBoGIk3X1SGjybAaJXrikrPwQuZ4UvSSRbWoKAbuMZOn_DWnkenhuWOsimFTVWTIN1Z0/s16000/Apps%20Ratings%20-%20TV.png" /></div></image>
<p>Users can interact with in-app ratings and reviews on their TVs by doing the following:</p>
<ul><ul>
<li>Select ratings using the remote control D-pad.</li>
<li>Provide optional written reviews using Gboard’s on-screen voice input, or by easily typing from their phone.</li>
<li>Send mobile notifications to themselves to complete their TV app review directly on their phone.</li>
</ul></ul><br />
<image><div style="text-align: center;"><img alt="User instructions for submitting TV app ratings and reviews on mobile" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhrCl8auQ-xpqFyC6pRATWehnwDlsKDjpA2aPj8YNltlGcxlen_zGj8Q29TMGLEc-eblYMMHp3wZCqnmShyphenhyphenzy2S2FTDXAdZAlYwsoC0hamJvbEm3VRJGrqvs6Wm0aaRqQg702WlPEe5su4fW0hovhBsFDUEi0UdtulpnPUc6E0KWb62j04fQ0SrJGC-TVs/s16000/Apps%20Ratings%20-%20Mobile.png" /></div></image><br />
<p>Additionally, users can leave reviews for other form factors directly from their phone by simply selecting the device chip when submitting an app rating or writing a review.</p>
<p>We've already seen a considerable lift in app ratings on TV since bringing these changes to Google TV, and now, we're making it possible for developers to trigger a ratings prompt as well.</p>
<p>Before we look at the integration, let's consider the best time to request a review. Identify optimal moments within your app to request user feedback, and ensure prompts appear only when the UI is idle so they don't interrupt ongoing content.</p>
<h2><span style="font-size: x-large;">In-App Review API</span></h2>
<p>Integrating the <a href="https://developer.android.com/guide/playcore/in-app-review" target="_blank">Google Play In-App Review API</a> works the same as on mobile; it takes only a couple of method calls:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: green; font-weight: bold;">val</span> manager = ReviewManagerFactory.create(context)
manager.requestReviewFlow().addOnCompleteListener { task ->
    <span style="color: green; font-weight: bold;">if</span> (task.isSuccessful) {
        <span style="color: #408080; font-style: italic;">// We got the ReviewInfo object</span>
        <span style="color: green; font-weight: bold;">val</span> reviewInfo = task.result
        manager.launchReviewFlow(activity, reviewInfo)
    } <span style="color: green; font-weight: bold;">else</span> {
        <span style="color: #408080; font-style: italic;">// There was some problem; log or handle the error code</span>
        @ReviewErrorCode <span style="color: green; font-weight: bold;">val</span> reviewErrorCode =
            (task.exception <span style="color: green; font-weight: bold;">as</span> ReviewException).errorCode
    }
}
</pre></div><br />
<p>First, invoke <span style="font-family: courier;"><a href="https://developer.android.com/reference/com/google/android/play/core/review/ReviewManager.html#requestReviewFlow()" target="_blank">requestReviewFlow()</a></span> to obtain a <span style="color: #0d904f; font-family: courier;">ReviewInfo</span> object which is used to launch the review flow. You must include an <span style="color: #0d904f; font-family: courier;"><a href="https://developers.google.com/android/reference/com/google/android/gms/tasks/Task#addOnCompleteListener%28com.google.android.gms.tasks.OnCompleteListener%3CTResult%3E%29" target="_blank">addOnCompleteListener()</a></span> not just to obtain the <span style="color: #0d904f; font-family: courier;">ReviewInfo</span> object, but also to monitor for any problems triggering this flow, such as the unavailability of Google Play on the device. Note that <span style="color: #0d904f; font-family: courier;">ReviewInfo</span> does not offer any insights on whether or not a prompt appeared or which action the user took if a prompt did appear.</p>
<p>The challenge is to identify <i>when</i> to trigger <span style="font-family: courier;"><a href="https://developer.android.com/reference/com/google/android/play/core/review/ReviewManager.html#launchReviewFlow(android.app.Activity,%20com.google.android.play.core.review.ReviewInfo)" target="_blank">launchReviewFlow()</a></span>. Track user actions—identifying successful journeys and points where users encounter issues—so you can be confident they had a delightful experience in your app.</p>
<p>For this method, you may optionally also include an <span style="color: #0d904f; font-family: courier;">addOnCompleteListener()</span> so your app's flow resumes when the returned task completes.</p>
<p>Note that due to throttling of how often users are presented with this prompt, there are no guarantees that the ratings dialog will appear when requesting to start this flow. For best practices, check <a href="https://developer.android.com/guide/playcore/in-app-review#when-to-request" target="_blank">this guide on when to request an in-app review</a>.</p>
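<p>Putting these pieces together, the gating logic can live in a small helper that records successful journeys and issues, then decides whether to prompt. The sketch below is illustrative only: <span style="font-family: courier;">JourneyTracker</span> and its methods are hypothetical names, not part of the Play Core API. When <span style="font-family: courier;">shouldRequestReview()</span> returns true, you would call <span style="font-family: courier;">requestReviewFlow()</span> and <span style="font-family: courier;">launchReviewFlow()</span> as shown above.</p>

```kotlin
// Hypothetical helper (not part of the Play Core API): tracks user
// journeys and decides when it is reasonable to request a review.
class JourneyTracker(
    private val successesNeeded: Int = 3,
    private val cooldownMillis: Long = 14L * 24 * 60 * 60 * 1000 // ~2 weeks
) {
    private var successCount = 0
    private var issueSeen = false
    private var lastPromptAtMillis: Long? = null

    fun recordSuccess() { successCount++ }  // e.g. finished a season
    fun recordIssue() { issueSeen = true }  // e.g. buffering or playback error

    // True only after enough successful journeys, no observed issues,
    // and outside the cooldown window since the last prompt.
    fun shouldRequestReview(nowMillis: Long): Boolean {
        val last = lastPromptAtMillis
        val eligible = successCount >= successesNeeded && !issueSeen &&
            (last == null || nowMillis - last >= cooldownMillis)
        if (eligible) {
            lastPromptAtMillis = nowMillis
            successCount = 0
        }
        return eligible
    }
}

fun main() {
    val tracker = JourneyTracker(successesNeeded = 2)
    tracker.recordSuccess()
    println(tracker.shouldRequestReview(nowMillis = 0)) // false: one success so far
    tracker.recordSuccess()
    println(tracker.shouldRequestReview(nowMillis = 0)) // true: threshold reached
    println(tracker.shouldRequestReview(nowMillis = 1)) // false: within cooldown
}
```

<p>Keep in mind that even when your own gating says yes, Play itself throttles the dialog, so treat the prompt as best-effort rather than guaranteed.</p>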
<h2><span style="font-size: x-large;">Get started with In-App Reviews on Google TV</span></h2>
<p>You can get a head start today by following these steps:</p>
<ul><ol>
<li>Identify <i>successful journeys</i> for users, like finishing a movie or TV show season.</li>
<li>Identify poor experiences that should be avoided, like buffering or playback errors.</li>
<li>Integrate the <a href="https://developer.android.com/guide/playcore/in-app-review" target="_blank">Google Play In-App Review API</a> to trigger review requests at optimal moments within the user journey.</li>
<li>Test your integration by following <a href="https://developer.android.com/guide/playcore/in-app-review/test" target="_blank">the testing guide</a>.</li>
<li>Publish your app and continuously monitor <a href="https://play.google.com/console/app/user-feedback/ratings-breakdown?dimension=deviceType&ratingsThreshold=ANY_NUMBER&timeRange=LAST_TWENTY_EIGHT_DAYS" target="_blank">your ratings by device type</a> in the Play Console.</li>
</ol></ul>
<p>We're confident this integration enables you to elevate your Google TV app ratings and empowers your users to share valuable feedback.</p>
<image><div style="text-align: center;"><img alt="Play Console Ratings graphic" border="0" height="179" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiogE70JfWQuMkET86-taW7P4o7j_FKD_ylQuMtK5x950waXHyq7B6XaMmElDguXeGy2Ea36euRH0cGNaJjvAJ0ZSYOpFDiGkbmWd6FDZV0j2wniSvc2n8cHir46hCXqLgMX18Nyuj6hkapfKsLAgbL08WXte4jgdxzycjyoQv-RLafwxiVTkc1nMGmAf8/s1600/Play%20Console%20Ratings%20Icon.png" width="20%" /></div></image>
<h3><span style="font-size: large;">Resources</span></h3>
<ul><ul>
<li><a href="https://developer.android.com/guide/playcore/in-app-review" target="_blank">Google Play In-App Review API</a></li>
<li><a href="https://play.google.com/console/app/user-feedback/ratings-breakdown?dimension=deviceType&ratingsThreshold=ANY_NUMBER&timeRange=LAST_TWENTY_EIGHT_DAYS" target="_blank">App ratings by device type</a> in your Play Console</li>
</ul></ul>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p>Android Developershttp://www.blogger.com/profile/08588467489110681140[email protected]0tag:blogger.com,1999:blog-6755709643044947179.post-13579673315525457712025-05-20T10:53:00.000-07:002025-05-20T15:54:14.364-07:00Google I/O 2025: What’s new in Android development tools<meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjAOpioaQdivubxYGDYFmFGdwUgSL93ZOD89pFnWFDBEd5xJnbI1yVw4H-lk1HPMVb_hnf-L8JSSLFwawVGGWXSsqfY34ZsIqZAYdLFjbjLSk77LMpA1UuSt3IVJMTc7W0QNlYW7jY2bhyphenhyphenH6mTBRH34JKQ4wUpkM7IhxIPDBgmUOuaXXpkYinDEXndvUIc/s1600/android-development-tools-google-io.png" name="twitter:image"></meta>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjAOpioaQdivubxYGDYFmFGdwUgSL93ZOD89pFnWFDBEd5xJnbI1yVw4H-lk1HPMVb_hnf-L8JSSLFwawVGGWXSsqfY34ZsIqZAYdLFjbjLSk77LMpA1UuSt3IVJMTc7W0QNlYW7jY2bhyphenhyphenH6mTBRH34JKQ4wUpkM7IhxIPDBgmUOuaXXpkYinDEXndvUIc/s1600/android-development-tools-google-io.png" style="display: none;" />
<em>Posted by <a href="https://twitter.com/makuchaku" target="_blank">Mayank Jain</a> – Product Manager, Android Studio</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiPmG2XrQ2m8_zpYT6Ryy1FkM4p3KWEGgwSlmLuL7AKfDBXjgfPodSwDK4J4EMNK5Z04DOHUGeRZB6HzV9B66133xnCxVHCfe0edzyzuvZPcc9Dmmh-gbX5NdQ8qDcMcYKUNz-YKe8VuVJ70nY_A7CbwaX_gUZMVDtQxiU4v4iDd2CorgJjj1b1jiqMDHE/s1600/O25-BHero-Android-8.png"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiPmG2XrQ2m8_zpYT6Ryy1FkM4p3KWEGgwSlmLuL7AKfDBXjgfPodSwDK4J4EMNK5Z04DOHUGeRZB6HzV9B66133xnCxVHCfe0edzyzuvZPcc9Dmmh-gbX5NdQ8qDcMcYKUNz-YKe8VuVJ70nY_A7CbwaX_gUZMVDtQxiU4v4iDd2CorgJjj1b1jiqMDHE/s1600/O25-BHero-Android-8.png" /></a>
<p>Android Studio continues to advance Android development by empowering developers to build better app experiences, faster. Our focus has been on improving AI-driven functionality with Gemini, streamlining UI creation and testing, and helping you future-proof apps for the evolving Android ecosystem. These innovations accelerate development cycles, improve app quality, and help you stay ahead in the fast-paced world of mobile development.</p>
<p>You can check out the <a href="https://io.google/2025/explore/pa-keynote-5" target="_blank">What’s new in Android Developer Tools</a> session at Google I/O 2025 to see some of the new features in action or better yet, try them out yourself by downloading <a href="https://developer.android.com/studio/preview" target="_blank">Android Studio Narwhal Feature Drop (2025.2.1) in the preview release channel</a>. Here’s a look at our latest developments:</p>
<h2><span style="font-size: x-large;">Get the latest Gemini 2.5 Pro model in Android Studio</span></h2>
<p>The power of artificial intelligence through Gemini is now deeply integrated into Android Studio, helping you at all stages of Android app development. Now <b>with access to Gemini 2.5 Pro</b>, we're continuing to look for new ways to use AI to supercharge Android development — and help you build better app experiences, faster.</p>
<h3><span style="font-size: large;">Journeys for Android Studio</span></h3>
<p>We’re also introducing agentic AI with Gemini in Android Studio. Testing your app is now much easier when you create journeys: just describe, in natural language, the actions and assertions for the user journeys you want to test, and Gemini performs the tests for you. Creating journeys lets you test your app’s critical user journeys across various devices without writing extensive code. You can then run these tests on local physical or virtual Android devices and validate that each test worked as intended by reviewing detailed results directly within the IDE. Although the feature is experimental, the goal is to increase the speed at which you can ship high-quality code, while significantly reducing the amount of time you spend manually testing, validating, or reproducing issues.</p>
<image><div style="text-align: center;"><img alt="moving image of Gemini testing an app in Android Studio" border="0" height="396" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgmLs3FPYZ7Dd3NwG70D-o-y9MJ4RBqHw4s5fDBTuoP085m57llJoOcJq25u1bBguqtUcsSavNpuPP7P3QJ9m7jbOuqoD5-CJjEKMWMRr1sS4-SzMqh9C4StpFJyzl_2aXE16sdwpQfBaUd9ybQN3xursQIoJJDWO4RibWm5rJ8Va2DE5sGmPxQR_n4Fss/w640-h396/journeys-android-studio-google-io.gif" width="100%" /></div><imgcaption><center><em>Journeys for Android Studio uses Gemini to test your app.</em></center></imgcaption></image><br />
<iframe allowfullscreen="" class="BLOG_video_class" height="413" src="https://www.youtube.com/embed/mP1tlIKK0R4" width="100%" youtube-src-id="mP1tlIKK0R4"></iframe><br />
<h3><span style="font-size: large;">Suggested fixes for crashes with Gemini</span></h3>
<p>The App Quality Insights panel has a great new feature. Crash insights now analyzes your app's source code referenced from the crash, and not only offers a comprehensive analysis and explanation of the crash, but in some cases even suggests a source fix! With just a few clicks, you can review the changes, accept the code suggestions, and push the changes to your source control. Now you can determine the root cause of a crash and fix it much faster!</p>
<image><div style="text-align: center;"><img alt="screenshot of crash analysis with Gemini in Android Studio" border="0" height="327" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgCqdLnKdLCdxf-yJ2_im4cD8rw-jfnjCHRdfEPbZY11R6n9yL2SM7KC8ojBiDYkNx43-EO1L4A5MT7YHTOqZoWDaCYcjPuTtO6V2RGACvZxDETYy3nJcTM1clvGZLJt1lxGKpej9If2FAKJS15wvE7gS43OZATP0RJbsisFveIEL2JekCPFPk5klOptJw/w640-h327/crash-analytics-android-studio-google-io.png" width="100%" /></div><imgcaption><center><em>Crash analysis with Gemini</em></center></imgcaption></image><br />
<h3><span style="font-size: large;">AI features in Studio Labs (stable releases only)</span></h3>
<p>We’ve heard feedback that developers want access to AI features in stable channels as soon as possible. Starting with the Narwhal stable release, you can discover and try out the latest experimental AI features through the <b>Studio Labs</b> tab in the Settings menu. You can get a first look at AI experiments, share your feedback, and help us bring them into the IDE you use every day. Go to the Studio Labs tab in Settings and enable the features you would like to start using. These AI features are automatically enabled in canary releases, where no action is required.</p>
<image><div style="text-align: center;"><img alt="screenshot of AI features in Studio Labs" border="0" height="555" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgpFpl5UaO1QIhu9h_Tqd_l6hkAB8_hDfmTw0X_c9VL4OJx9m-D0suwi8zAorBBRkMjeJ1RiGQjMTJudm_SRuiAZZS5AeqhNYeitm-L7tNx58juYrOE0_RC226L6Ap0lxJv-JOijHBE28OOFg9JReOs7oW9DaG7Q_3d5fge2EA1mtl6eNiHfLsmG9xyS28/w640-h555/ai-features-studio-labs-android-studio-google-io.png" width="100%" /></div><imgcaption><center><em>AI features in Studio Labs</em></center></imgcaption></image><br />
<ul><ul>
<li><h4><span style="font-size: medium;"><b>Compose preview generation with Gemini</b></span></h4></li>
<p>Gemini can automatically generate Jetpack Compose preview code, saving you time and effort. You can access this feature by right-clicking within a composable and navigating to <b>Gemini > Generate Compose Preview</b> or <b>Generate Compose Preview for this file</b>, or by clicking the link in an empty preview panel. The generated preview code is presented in a diff view that enables you to quickly accept, edit, or reject the suggestions, providing a faster way to visualize your composables.</p>
<image><div style="text-align: center;"><img alt="moving image of compose preview generation with gemini in Android Studio" border="0" height="411" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj37jNhB8Vl4-EIpqBid_xqnwpgo5RNM7O4FzQIA3XXYjYNMR6Zj6wS06zV2bX2MH-qVDIcNQeZASqXXI-z71ZKkIHlFoiJ9wZ-HZ1r5swc6ze_xeOu1MaVMjA_dP9mKvnsb1MUCqOPZ0trN8WuPl-oVDw8PtnjuQiiCRs3xdJtbkEjKgbn_K7Z_MlC7hc/w640-h411/compose-preview-generation.gif" width="640" /></div><imgcaption><center><em>Compose Preview generation with Gemini</em></center></imgcaption></image><br />
<li><h4><span style="font-size: medium;"><b>Transform UI with Gemini</b></span></h4></li>
<p>You can now transform UI code within the Compose Preview environment using natural language directly in the preview. To use it, right-click in the Compose Preview and select "Transform UI With Gemini". Then enter your natural language requests, such as "Center align these buttons," to guide Gemini in adjusting your layout or styling, or select specific UI elements in the preview for better context. Gemini will then edit your Compose UI code in place, which you can review and approve, speeding up the UI development workflow.</p>
<image><div style="text-align: center;"><img alt="side by side screenshots showing transforming UI with Gemini in Android Studio" border="0" height="495" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhyhSZP_j5Im1YPNF1Kds9IAGITa3TXWrRlqG_J41XzA67GNoCscWoNNSC0kFgP7yPkWWkIxsq5aFlB_67yVZ32Cg-Sd9nqpJsH5EVc81fkNstJtWlOd12vIJ43e360emr8iY9Z1CZhcBPicOjM0fcxXQ25_yWwHu67DNtxHlLcUt1XL6ThyJPUouqFIGU/w640-h495/transform-ui-gemini-android-studio-google-io.png" width="100%" /></div><imgcaption><center><em>Transform UI with Gemini</em></center></imgcaption></image><br />
<li><h4><span style="font-size: medium;"><b>Image attachment in Gemini</b></span></h4></li>
<p>You can now attach image files and provide additional information along with your prompt. For example, you can attach UI mock-ups or screenshots to give Gemini context about your app’s layout. Gemini can then generate Compose code based on a provided image, or explain the composables and data flow of a UI screenshot.</p>
<image><div style="text-align: center;"><img alt="screenshot of image attachment and preview generation via Gemini in Android Studio" border="0" height="504" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgt_eW2ziPQM9lJf3KYzs9MF7IWb7AFh6AyeXYry852Pc0TSnrRbR_Ytx86M7R3PARYIeF9qPvIv0jacNyHIFErPCTF4CGspihCEQGt-sv8sYd-O3dHKA48oPUUz5s2lntuWDFE29aCeNuxu1IWzXRMGHQEco2cAW1TAgQj-6VV3-Fq0bHH-9WTES-Nk44/w640-h504/image-attachment-preview-generation-gemini-android-studio-google-io.png" width="100%" /></div><imgcaption><center><em>Image attachment and preview generation via Gemini in Android Studio</em></center></imgcaption></image><br />
<li><h4><span style="font-size: medium;"><b>@File context in Gemini</b></span></h4></li>
<p> You can now attach your project files as context in chat interactions with Gemini in Android Studio. This lets you quickly reference files in your prompts for Gemini. In the Gemini chat input, type @ to bring up a file completion menu and select files to attach. You can also click the <b>Context</b> drop-down to see which files were automatically attached by Gemini. This gives you more control over the context sent to Gemini.</p>
<image><div style="text-align: center;"><img alt="screenshot of @File context in Gemini in Android Studio" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjBC6AYTG6302XHxeCytk63_Ur28Qs_OE23ve0Kim83TLKpapVORYrZw-Im9qD2VPsfL-PqVvDDfuEXqIa7hyzL7xW39iJhI2ijZfpG3yXgcpYD_p3j62OPYagzVRHvNhkI0ezzyNScvT6xT0s2WlCGwtechOdGJFfHMQ_w2_8TcdMvf7W2LmkVAjJJ4l4/s1600/screenshot-file-context-android-studio-google-io.png" width="100%" /></div><imgcaption><center><em>@File context in Gemini</em></center></imgcaption></image><br />
</ul></ul>
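<p>To make the preview-generation flow above concrete, here is a hedged sketch of the kind of code such a feature typically produces: a parameterless composable annotated with <b>@Preview</b> that invokes your composable with sample data. The <b>GreetingCard</b> composable and its sample arguments are illustrative, not taken from the announcement.</p>

```kotlin
import androidx.compose.material3.Text
import androidx.compose.runtime.Composable
import androidx.compose.ui.tooling.preview.Preview

// An existing composable in your app (illustrative).
@Composable
fun GreetingCard(name: String) {
    Text(text = "Hello, $name!")
}

// The shape of a generated preview: a no-argument @Preview composable
// that renders GreetingCard with plausible sample data.
@Preview(showBackground = true)
@Composable
fun GreetingCardPreview() {
    GreetingCard(name = "Ada")
}
```

<p>Because the suggestion arrives as a diff, you can adjust the sample data before accepting it.</p>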
<h3><span style="font-size: large;">Rules in Prompt Library</span></h3>
<p><b>Rules in Gemini</b> let you define preferred coding styles or output formats within the Prompt Library. You can also mention your preferred tech stack and languages. When you set these preferences once, they are automatically applied to all subsequent prompts sent to Gemini. Rules help the AI understand project standards and preferences for more accurate and tailored code assistance. For example, you can create a rule such as “Always give me concise responses in Kotlin.”</p>
<image><div style="text-align: center;"><img alt="prompt library in Android Studio" border="0" height="453" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhVAHsw5qrEhl69OeETHW72xlnWtdRO_x924nqETwWO6DCcO3rp4xNHU37voeUqfrLpWfCz2V3Sre8WRQeAbufLqd_RyljsJIv8uUd96htZ_kLAKZU9AvjAQBZZ2NbOfQlDG3VPlfzwMzGNhAnDZ48ZUwWEjVOqL9YeFnPuh0dtyV-kwL0xYkOLsD9J2Dc/w640-h453/image19.png" width="640" /></div><imgcaption><center><em>Prompt Library Improvements</em></center></imgcaption></image><br />
<h3><span style="font-size: large;">Gemini in Android Studio for businesses</span></h3>
<p><a href="https://developer.android.com/gemini-for-businesses" target="_blank">Gemini in Android Studio for businesses</a> is now available. It provides all the benefits of Gemini in Android Studio, plus <b>enterprise-grade privacy and security features</b> backed by Google Cloud — giving your team the confidence they need to deploy AI at scale while keeping their data protected.</p>
<p>Developers and admins can unlock these features and benefits by subscribing to <a href="https://codeassist.google/products/business" target="_blank">Gemini Code Assist</a> Standard or Enterprise editions. Discover the <a href="https://android-developers.googleblog.com/2025/04/gemini-in-android-studio-for-business.html" target="_blank">full list of Gemini in Android for business features available for your organization</a>.</p>
<h2><span style="font-size: x-large;">Improved tools for creating great user experiences</span></h2>
<p>Elevate your Compose UI development with the latest Android Studio enhancements.</p>
<h3><span style="font-size: large;">Compose preview improvements</span></h3>
<p>Compose preview interaction is now more efficient with the latest navigation improvements. Click the preview name to jump to the preview definition, or click an individual component to jump to the function where it’s defined. Hover states provide immediate visual feedback as you mouse over a preview frame, and improved keyboard arrow navigation eases movement through multiple previews, enabling faster UI iteration and refinement. Additionally, the Compose preview picker is now also available in the stable release.</p>
<image><div style="text-align: center;"><img alt="moving image of compose preview navigation improvements in Android Studio" border="0" height="508" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjIwDfj3R5TPO2OT-Im0buqjXfW9lqiBb7x2jSqfmspdjJlmQwQ3-3qwXcHaTZ5mxXY3qxuveNzlrBeC542Snf_soBpwGF3nLuCBtKB-tIt9_cQLlTadTKjaFyXmvZw2vVSyi33Z10QpJEFJSx9enuCmt-xxpu4F7G3kT3wOoHRv_58Mn0CUHl01gJy05o/w640-h508/compose-preview-navigation-improvements.gif" width="100%" /></div><imgcaption><center><em>Compose preview navigation improvements</em></center></imgcaption></image><br />
<image><div style="text-align: center;"><img alt="Compose preview picker in Android Studio" border="0" height="400" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgVVd8m-L4dACjPgLLVhrQKcBywDcDzg-SJk82v72u-qVMzOYrgZMpNXDW2BjGITnbDv8r7rfTmFkHK827Sd4D0ks5mHViNDhasyYCAbqYpTNYwsmRj0Y74Wrxh07aWetgoUBhyphenhyphenG_CgfxpSoS1buG4tmu0bbGu2fSD_TZtgCNC-QulAutdBuPxyJoRqWl8/w289-h400/image25.png" width="50%" /></div><imgcaption><center><em>Compose preview picker </em></center></imgcaption></image><br />
<h3><span style="font-size: large;">Resizable Previews</span></h3>
<p>While in Compose Preview’s focus mode in Android Studio, you can now resize the preview window by dragging its edges. This gives you instant visual feedback on how your UI adapts to different screen sizes, ensuring responsiveness and visual consistency. This rapid iteration helps create UIs that look great on any Android device.</p>
<image><div style="text-align: center;"><img alt="ALT TEXT" border="0" height="640" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgJONnzdnMnQ8u4pqXScboFWmBspU9bAb0tfSteoBpaSweB9mPpRjeMfbRh4cqhBNBr9SIeo_vyruEhRVjhB1MMYrScUWTun4dZXIAvacWIYrWIxVY1GeXRELNq9HJcS8t8gZX3Nz62WJHLH-FGgmKv2fioAVMPFCU11DSS8hnAX-h6AI90_hH1vXyzB88/w568-h640/image8.gif" width="100%" /></div><imgcaption><center><em>Resizable Preview</em></center></imgcaption></image><br />
<h3><span style="font-size: large;">Embedded Android XR Emulator</span></h3>
<p>The Android XR Emulator now launches by default in the embedded state. You can now deploy your application, navigate the 3D space, and use the Layout Inspector directly inside Android Studio, streamlining your development flow.</p>
<image><div style="text-align: center;"><img alt="Embedded XR emulator in Android Studio" border="0" height="379" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiAL7VlmgyLuu6SV6tFb9VdFeFjrHIS4MOKwoSHGmFYUWYPLCAFKoNmyZbZ7jqmqDnm5L2LvtO-e4VXRdjilUCZ6tYrqFFAk-iiQz4jl-gaZPQC1s6MrPFjHNI3XNibiU55TblxlMrPVayTl_FeV6r5M9k0SajURa734SMlrq7-AXr5h13pvW2cxEbso-g/w640-h379/image4.gif" width="100%" /></div><imgcaption><center><em>Embedded XR Emulator</em></center></imgcaption></image><br />
<h2><span style="font-size: x-large;">Improved tools for future-proofing and testing your Android apps</span></h2>
<p>We’ve enhanced some of your favorite features so that you can test more confidently, future-proof your apps, and ensure app compatibility across a wide range of devices and Android versions.</p>
<h3><span style="font-size: large;">Streamlined testing with Backup and Restore support</span></h3>
<p>Android Studio offers built-in Backup and Restore support by letting you trigger app backups on connected devices directly from the <b>Running Devices</b> window. You can also configure your <b>Run/Debug</b> settings to automatically restore from a previous backup when launching your app. This simplifies the process of validating your app's Backup and Restore implementation and speeds up development by reducing manual setup for testing.</p>
<image><div style="text-align: center;"><img alt="Streamlined testing with backup and restore support in Android Studio" border="0" height="436" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi7hjAnLkiJb5u7Cxl8KEWjUTFxtu_oKQi9JxvqO0ZzKVmbnYZInFrdp8FjytTYrTno3msd__tU85mqDqGibzPacFuE4X9iDCrv8C2WYmSo9VfXuo4tjyGdEa9_LGpSoq5CTVVG0qyJq0emtfDvOaOumvW1cY-jaLVj7kYnBcmc1YTUmPBLG4uye82A9Ek/w640-h436/image18.png" width="100%" /></div><imgcaption><center><em>Streamlined testing with Backup and Restore support</em></center></imgcaption></image><br />
<h3><span style="font-size: large;">Android’s transition to 16 KB Page Size</span></h3>
<p>The underlying architecture of Android is evolving, and a key step forward is the transition to <b>16 KB page sizes</b>. This fundamental change requires all Android apps with native code or dependencies to be recompiled for compatibility. To help you navigate this transition smoothly, Android Studio now offers proactive warnings when building APKs or Android App Bundles that are incompatible with 16 KB devices. Using the APK Analyzer, you can also find out which libraries are incompatible with 16 KB devices. To test your apps in this new environment, a dedicated 16 KB emulator target is also available in Android Studio alongside existing 4 KB images.</p>
<image><div style="text-align: center;"><img alt="Android’s transition to 16 KB page size in Android Studio" border="0" height="318" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhddRGMI2OUAwfmYlPHEVqnnS3Tmwajd30ZKOID7dZqz2g1DgBbkFP7i4mh-DM9TrgZy2WxGZXK69n1xpGFPRBMNrO-hFWrOYZ8doKyUgwqDsIUGop2ZZ4TZfUwX3mhbWVl0J0LcFMexiHib-2r-tA_Mi99oTKMBuS0cX7WwunaMUAClDfJ9NwOrQAdsfs/w640-h318/image12.png" width="100%" /></div><imgcaption><center><em>Android’s transition to 16 KB page size</em></center></imgcaption></image><br />
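<p>For apps with native code, preparing for 16 KB devices usually means rebuilding shared libraries with 16 KB-aligned ELF load segments. As a hedged sketch (the flag name follows the NDK's 16 KB guidance for NDK r27 and may differ for your toolchain; newer NDKs align by default), a CMake-based module can opt in from its Gradle build file:</p>

```kotlin
// build.gradle.kts (module) -- a sketch; assumes AGP 8.5.1+ with NDK r27.
android {
    defaultConfig {
        externalNativeBuild {
            cmake {
                // Ask the NDK toolchain to link .so files with 16 KB-aligned
                // load segments (equivalent to -Wl,-z,max-page-size=16384).
                arguments += "-DANDROID_SUPPORT_FLEXIBLE_PAGE_SIZES=ON"
            }
        }
    }
}
```

<p>After rebuilding, the APK Analyzer and the 16 KB emulator target mentioned above are the quickest ways to confirm the result.</p>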
<h3><span style="font-size: large;">Backup and Sync your Studio settings</span></h3>
<p>When you sign in with your Google account or a JetBrains account in Android Studio, you can now sync your customizations and preferences across all installs and restore them automatically on remote Android Studio instances. Simply select “Enable Backup and Sync” while signing in to Android Studio, or go to Settings > Backup and Sync, and follow the prompts.</p>
<image><div style="text-align: center;"><img alt="Backup and sync settings in Android Studio" border="0" height="498" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgRUMnQu0LMKu7Xt7qFPTDKnT4iwWws4syX81stVC6Xf4fAeQg1uDnUzqNGQKbNUxfMLgiZRVamsZqlX6aqqQp_9xMTFEZNJ6mZXvKBwksmby53L7LCdprYeSp4cF0ufw8qjC5xAHRcfzm4whGnpQtTbNOkBQHj-Z5gSJyYy-8qFOxL68xo6DVfK64aWIk/w640-h498/image13.png" width="100%" /></div><imgcaption><center><em>Backup and Sync your Studio settings</em></center></imgcaption></image><br />
<h3><span style="font-size: large;">Increasing developer productivity with Android’s Kotlin Multiplatform improvements</span></h3>
<p>Kotlin Multiplatform (KMP) enables teams to reach new audiences across Android and iOS with less development time. Usage has been growing in the developer community, with apps such as Google Docs now using it in production. We’ve released new Android Studio <a href="https://developer.android.com/kotlin/multiplatform/migrate" target="_blank">KMP project templates</a>, updated <a href="https://developer.android.com/kotlin/multiplatform" target="_blank">Jetpack libraries</a> and new codelabs (<a href="https://developer.android.com/codelabs/kmp-get-started" target="_blank">Get Started with KMP</a> and <a href="https://developer.android.com/codelabs/kmp-migrate-room" target="_blank">Migrate Existing Apps to Room KMP</a>) to help developers who are looking to get started with KMP.</p>
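<p>To illustrate the sharing model KMP enables, here is a minimal hedged sketch of Kotlin's expect/actual mechanism (names are illustrative, and each section lives in its own source set rather than a single file):</p>

```kotlin
// commonMain: shared logic declares what each platform must provide.
expect fun platformName(): String

// Shared business logic, written once for Android and iOS.
fun greeting(): String = "Hello from ${platformName()}!"

// androidMain: the Android implementation of the expected declaration.
actual fun platformName(): String = "Android"

// iosMain: the iOS implementation, typically backed by Kotlin/Native interop.
actual fun platformName(): String = "iOS"
```

<p>The new KMP project templates generate this source-set layout for you, so shared modules slot into an existing Android app.</p>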
<h2><span style="font-size: x-large;">Experimental features and features coming soon to Android Studio</span></h2>
<h3><span style="font-size: large;">Android Studio Cloud (experimental)</span></h3>
<p><a href="https://developer.android.com/studio/preview/android-studio-cloud" target="_blank">Android Studio Cloud</a> is now available as an experimental public preview, accessible through <a href="https://studio.firebase.google.com/?e=MonospaceEnabledFeaturesCapraBuild::Launch::Enrolled,DeploymentAndObservabilityFeatureLaunch::Launch::Enrolled,MonospaceEnabledFeaturesProPlan::Launch::Enrolled" target="_blank">Firebase Studio</a>. This service streams a Linux virtual machine running Android Studio directly to your web browser, enabling Android application development from anywhere with an internet connection. Get started quickly with dedicated workspaces featuring pre-downloaded Android SDK components. Explore sample projects or seamlessly access your existing Android app projects from GitHub without a local installation. Please note that Android Studio Cloud is currently in an experimental phase. Features and capabilities are subject to significant change, and users may encounter known limitations.</p>
<image><div style="text-align: center;"><img alt="Android Studio Cloud" border="0" height="422" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhBE1YlhOqDCBwITUzNVeRPfh6ejl-BcVeepSAzyY4aEHDU8WfcENYeOWz8LMCVUen05PMlFf67ClsOX9DtgcNbQKhqiViy7mnbHutFi_l9XzPNPT8y0vH5fTsW7wbUW1cENSTQI4ORKW5tuUX1EUr9ETzqrymzwwVBU7pY0k96oZEagBNj0tNk9fK_2mc/w640-h422/image10.png" width="100%" /></div></image><br />
<h3><span style="font-size: large;">Version Upgrade Agent (coming soon)</span></h3>
<p>The <b>Version Upgrade Agent</b>, as part of Gemini in Android Studio, is designed to save you time and effort by automating your dependency upgrades. It intelligently analyzes your Android project, parses the release notes for included libraries, and proposes updates directly from your <span style="color: #0d904f; font-family: courier;">libs.versions.toml</span> file or the refactoring menu (<b>right-click > Refactor > Update dependencies</b>). The agent automatically updates dependencies to the latest compatible version, builds the project, fixes any errors, and repeats until all errors are fixed. Once the dependencies are upgraded, the agent generates a report showing the changes it made, as well as a high-level summary highlighting the changes included in the updated libraries.</p>
<image><div style="text-align: center;"><img alt="Version upgrade agent in Android Studio" border="0" height="400" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhvvIkyCZZ3dUK7f7zU9wYiggkexmnRMwMUuQTMdeoajLRCtD8_QzCsUXAQRBzVBLDYuhLvbb6UALUtvoL2rYRkfijVxcdiFTmqu_3QhYkkYhCaDqFZONVTiOzp-hzXLy5UEgIVanhSYx_-ETYBDWi8BESwEMgREl2w-tJsaLZt8NJxJgiBhKbJsmqeQyQ/w336-h400/image22.png" width="50%" /></div><imgcaption><center><em>Version Upgrade Agent</em></center></imgcaption></image><br />
<iframe allowfullscreen="" class="BLOG_video_class" height="413" src="https://www.youtube.com/embed/ubyPjBesW-8" width="100%" youtube-src-id="ubyPjBesW-8"></iframe>
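<p>For context on what the agent edits, a Gradle version catalog is a small TOML file. A minimal <span style="color: #0d904f; font-family: courier;">libs.versions.toml</span> (the coordinates and version numbers here are illustrative) looks like:</p>

```toml
[versions]
kotlin = "2.1.0"
coroutines = "1.9.0"

[libraries]
# The agent bumps version references like these, then rebuilds to verify.
kotlinx-coroutines = { group = "org.jetbrains.kotlinx", name = "kotlinx-coroutines-core", version.ref = "coroutines" }

[plugins]
kotlin-android = { id = "org.jetbrains.kotlin.android", version.ref = "kotlin" }
```

<p>Because every dependency version flows through this one file, automated upgrades stay reviewable in a single diff.</p>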
<h3><span style="font-size: large;">Agent Mode (coming soon)</span></h3>
<p>Agent Mode is a new autonomous AI feature using Gemini, designed to handle complex, multi-stage development tasks that go beyond typical AI assistant capabilities, invoking multiple tools to accomplish tasks on your behalf.</p>
<p>You can describe a complex goal, like integrating a new API, and the agent will formulate an execution plan that spans across files in your project — adding necessary dependencies, editing files, and iteratively fixing bugs. This feature aims to empower all developers to tackle intricate challenges and accelerate the building and prototyping process. You can access it via the Gemini chat window in Android Studio.</p>
<a href="#play-policy-insights-beta"></a><image><div style="text-align: center;"><img alt="Agent Mode in Android Studio" border="0" height="640" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEikxStCRE3UZXJGWppDQrXSpCav71jJs3Zzy81B401RGn0j8NuWM3rUtziYNn8AQWPP25D407fUcH-82AsU0jTLezlV1uK_0zsXrhUm7ym6fi_gqOgGpjWIjDgGXQcCKK_L1Nlm9FwEl-ZC2-geujiu22G4oramkZfpM1c7h3LQtAeQ8IWbxM38HabQDZQ/w367-h640/image1.png" width="50%" /></div><imgcaption><center><em>Agent Mode</em></center></imgcaption></image><br />
<h3 id="play-policy-insights-beta"><span style="font-size: large;">Play Policy Insights beta in Android Studio (coming soon)</span></h3>
<p>Android Studio now includes richer insights and guidance on Google Play policies that might impact your app. This information, available as lint checks, helps you build safer apps from the start, preventing issues that could disrupt your launch process and cost more time and resources to fix later on. These lint checks present an overview of the policy, dos and don’ts, and links to Play policy pages where you can find more information.</p>
<image><div style="text-align: center;"><img alt="Play Policy Insights beta in Android Studio" border="0" height="427" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh3KlOHZKCu-bIGrD8eTDWz9fFsllVK2eofP67vbMv_GIPSBOHyeWb8IpHXjSJmVbnZ0YJVn_06A9sZZIjLy3KW89-D_Rhi3Kw9cImg-AVvukA0D2X70FuJ0Dn3TsnI82xCuTjt4i5GDWd78jDxjYaQlAiiZ8-YwEqIl1sMFs4ET1Cukr-2d_06aWuF-z4/w640-h427/play-policy-insights-beta-android-studio.png" width="100%" /></div><imgcaption><center><em>Play Policy Insights beta in Android Studio</em></center></imgcaption></image><br />
<h2><span style="font-size: x-large;">IntelliJ Platform Update (2025.1)</span></h2>
<p>Here are some important IDE improvements in the IntelliJ IDEA 2025.1 platform release:</p>
<ul><ul>
<li><b>Kotlin K2 mode:</b> Android Studio now supports Kotlin K2 mode in Android-specific features requiring language support, such as Live Edit, Compose Preview, and many more</li></ul><br /><ul>
<li><b>Improved dependency resolution in Kotlin build scripts:</b> Makes your Kotlin build scripts for Android projects more stable and predictable</li></ul><br /><ul>
<li><b>Hints about code alterations by Kotlin compiler plugins:</b> Gives you clearer insights into how plugins used in Android development modify your Kotlin code</li></ul><br /><ul>
<li><b>Automatic download of library sources for Gradle projects:</b> Simplifies debugging and understanding your Android project dependencies by providing immediate access to their source code</li></ul><br /><ul>
<li><b>Support for Gradle Daemon toolchains:</b> Helps prevent potential JVM errors during your Android project builds and ensures smoother synchronization</li></ul><br /><ul>
<li><b>Automatic plugin updates:</b> Keeps your Android development tools within IntelliJ IDEA up-to-date effortlessly</li>
</ul></ul>
<h2><span style="font-size: x-large;">To Summarize</span></h2>
<p>Android Studio Narwhal Feature Drop (2025.2.1) is now available in the Android Studio canary channel with some amazing features to help your Android development.</p>
<h3><span style="font-size: large;">AI-powered development tools for Android</span></h3>
<ul><ul>
<li><b>Journeys for Android Studio:</b> Validate app flows easily using tests and assertions in natural language</li>
<li><b>Suggested fixes for crashes with Gemini:</b> Determine the root cause of a crash and fix it much faster with Gemini</li>
<li><b>AI features in Studio Labs</b></li>
<ul><ul>
<li><b>Compose preview generation with Gemini:</b> Generate Compose previews with Gemini's code suggestions</li>
<li><b>Transform UI with Gemini:</b> Transform UI in Compose Preview with natural language, speeding development</li>
<li><b>Image attachment in Gemini:</b> Attach images to Gemini for context-aware code generation</li>
<li><b>@File context in Gemini:</b> Reference project files in Gemini chats for quick AI prompts</li>
</ul></ul>
<li><b>Rules in Prompt Library:</b> Define preferred coding styles or output formats within the Prompt Library
</li></ul></ul>
<h3><span style="font-size: large;">Improved tools for creating great user experiences</span></h3>
<ul><ul>
<li><b>Compose preview improvements:</b> Navigate the Compose Preview using clickable names and components</li>
<li><b>Resizable preview:</b> Instantly see how your Compose UI adapts to different screen sizes</li>
<li><b>Embedded XR Emulator:</b> XR Emulator now launches by default in the embedded state</li>
</ul></ul>
<h3><span style="font-size: large;">Improved tools for future-proofing and testing your Android apps</span></h3>
<ul><ul>
<li><b>Streamlined testing with Backup and Restore support:</b> Effortless app testing, trigger backups, auto-restore for faster validation</li>
<li><b>Android’s transition to 16 KB Page Size:</b> Prepare for Android's 16 KB page size with Studio's early warnings and testing</li>
<li><b>Backup and Sync your Studio settings:</b> Sync Android Studio settings across devices and restore automatically for convenience</li>
<li><b>Increasing developer productivity with Android’s Kotlin Multiplatform improvements:</b> Simplified cross-platform Android and iOS development with new tools</li>
</ul></ul>
<h3><span style="font-size: large;">Experimental features and features coming soon to Android Studio</span></h3>
<ul><ul>
<li><b>Android Studio Cloud (experimental):</b> Develop Android apps from any browser with just an internet connection</li>
<li><b>Version Upgrade Agent (coming soon):</b> Automated dependency updates save time and effort, ensuring projects stay current</li>
<li><b>Agent Mode (coming soon):</b> Empowering developers to tackle complex, multi-stage tasks that go beyond typical AI assistant capabilities</li>
<li><b>Play Policy Insights beta in Android Studio (coming soon):</b> Insights and guidance on Google Play policies that might impact your app</li>
</ul></ul>
<h2><span style="font-size: x-large;">How to get started</span></h2>
<p>Ready to try the exciting new features in Android Studio?</p>
<p>You can download the canary version of <a href="https://developer.android.com/studio/preview" target="_blank"><b>Android Studio Narwhal</b></a> Feature Drop (2025.2.1) today to incorporate these new features into your workflow, or try the latest AI features using Studio Labs in the stable version of <a href="https://developer.android.com/studio" target="_blank"><b>Android Studio Meerkat</b></a>. You can also <a href="https://developer.android.com/studio/preview/install-preview" target="_blank">install them side by side by following these instructions</a>.</p>
<p>As always, your feedback is important to us – <a href="https://developer.android.com/studio/known-issues" target="_blank">check known issues</a>, <a href="https://developer.android.com/studio/report-bugs" target="_blank">report bugs</a>, <a href="https://developer.android.com/studio/report-bugs" target="_blank">suggest improvements</a>, and be part of our vibrant community on <a href="https://www.linkedin.com/showcase/androiddev/posts/?feedView=all" target="_blank">LinkedIn</a>, <a href="https://medium.com/androiddevelopers" target="_blank">Medium</a>, <a href="https://www.youtube.com/c/AndroidDevelopers/videos" target="_blank">YouTube</a>, or <a href="https://twitter.com/androidstudio" target="_blank">X</a>. Let's build the future of Android apps together!</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p><br />
Android Developershttp://www.blogger.com/profile/08588467489110681140[email protected]0tag:blogger.com,1999:blog-6755709643044947179.post-87008238850631164322025-05-20T10:52:00.000-07:002025-05-23T10:35:44.318-07:00Engage users on Google TV with excellent TV apps<meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjBD5VvvBuR8AfswZAlwvboh4l2DzgISOScKiZnoprl7W4QhdVYlKTMWt2KVIVr0jHZNRsQAGMg1qQjuDb1BUImQMroMyuDvr0TzgcWSYKVp36R08V7IszIxO75_uKInNJshrsXkKau05Tfza7uHJWzwuHZesH2hPzYBL6GVdf0HLPZZUL6ITMW7ZgLfOQ/s1600/gemini-google-tv-google-io-wicked.gif" name="twitter:image"></meta>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjBD5VvvBuR8AfswZAlwvboh4l2DzgISOScKiZnoprl7W4QhdVYlKTMWt2KVIVr0jHZNRsQAGMg1qQjuDb1BUImQMroMyuDvr0TzgcWSYKVp36R08V7IszIxO75_uKInNJshrsXkKau05Tfza7uHJWzwuHZesH2hPzYBL6GVdf0HLPZZUL6ITMW7ZgLfOQ/s1600/gemini-google-tv-google-io-wicked.gif" style="display: none;" />
<em>Posted by Shobana Radhakrishnan - Senior Director of Engineering, Google TV, and Paul Lammertsma - Developer Relations Engineer, Android</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjKjPUceDGVyuOm8Dvw_kMpTe5ArHNAYkWBvE0nNNktZiJJt8Lo45Mc4Gj827Y8M8W2hXa2xrRc1OeutuX_PxzcyobyNgzQ4tAyvPOK7j4A_MpynGEss5zJReG2gdZyo6QzsXa5ll77x6BUJjY5MdHBx1AEbDBnpt7gu_VvW8jjhQscpNNvo1QcjkGRWaA/s1600/engaging-users-google-tv-excellent-apps-google-io-hero.png"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjKjPUceDGVyuOm8Dvw_kMpTe5ArHNAYkWBvE0nNNktZiJJt8Lo45Mc4Gj827Y8M8W2hXa2xrRc1OeutuX_PxzcyobyNgzQ4tAyvPOK7j4A_MpynGEss5zJReG2gdZyo6QzsXa5ll77x6BUJjY5MdHBx1AEbDBnpt7gu_VvW8jjhQscpNNvo1QcjkGRWaA/s1600/engaging-users-google-tv-excellent-apps-google-io-hero.png" /></a>
<p>Over the past year, Google TV and Android TV achieved over 270 million monthly active devices, establishing one of the largest smart TV OS footprints. Building on this momentum, we are excited to share new platform features and developer tools designed to help you increase app engagement with our expanding user base.</p>
<iframe allowfullscreen="" class="BLOG_video_class" height="413" src="https://www.youtube.com/embed/OosLbRBM9dA" width="100%" youtube-src-id="OosLbRBM9dA"></iframe>
<h2><span style="font-size: x-large;">Google TV with Gemini capabilities</span></h2>
<p>Earlier this year, we announced that we’ll bring Gemini capabilities to Google TV, so users can speak more naturally and conversationally to find what to watch and get answers to complex questions.</p>
<image><div style="text-align: center;"><img alt="A user pulls up Gemini on a TV asking for kid-friendly movie recommendations similar to Jurassic Park. Gemini responds with several movie recommendations" border="0" height="360" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhKvfqPYyyIsHRhDIrjoDWYw86QGvs_HtECJcvKaDi2zl1yFX6DC3_W4QzdlosAmlo7hKVM3kv-Gv_1i6hFf9SQARCUGfpPoUqezGKVx4ncZJ3NkpxIQn14NfpLfZXoWRQdPbPLvXiRuf8B9mgrDPRyUkQMIlo9ubWotEjEEHIJra72reme_vemOiSOavs/w640-h360/Gemini-Google-TV-Jurassic-Park.gif" width="640" /></div></image><br />
<p>After each movie or show search, our new voice assistant will suggest relevant content from your apps, significantly increasing the discoverability of your content.</p>
<image><div style="text-align: center;"><img alt="A user pulls up Gemini on a TV asking for help explaining the solar system to a first grader. Gemini responds with YouTube videos to help explain the solar system" border="0" height="360" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgxjRZsbGmA8qf966pcJJY8TknBwyC9Inwxz4S5kHsdUifAfRhknxnYaOpsTWFqF7ZFsyFMTe8PUgWLOSUcoEEXBT0oR4D_SaY9mJDX6LkvHO0KbHiJaYJ11Jc2FG-oQsbFWZ9Nil7Gk5FL5QCqt-FJkXNmis_88wP2vuacbNCMCvo0-aDY7IuuA32g9yQ/w640-h360/Gemini-Google-TV-Solar-System.gif" width="640" /></div></image><br />
<p>Plus, users can easily ask questions about topics they're curious about and receive insightful answers with supporting videos.</p>
<p>We’re so excited to bring this helpful and delightful experience to users this fall.</p>
<h2><span style="font-size: x-large;">Video Discovery API</span></h2>
<p>Today, we’ve also opened partner enrollment for our Video Discovery API.</p>
<p>Video Discovery optimizes Resumption, Entitlements, and Recommendations across all Google TV form factors to enhance the end-user experience and boost app engagement.</p>
<ul><ul>
<li><b>Resumption:</b> Partners can now easily display a user's paused video within the 'Continue Watching' row from the home screen. This row is a prime location that drives 60% of all user interactions on Google TV.</li></ul><ul>
<li><b>Entitlements:</b> Video Discovery streamlines entitlement management, which matches app content to user eligibility. Users appreciate this because they can enjoy personalized recommendations without needing to manually update all their subscription details. This allows partners to connect with users across multiple discovery points on Google TV.</li></ul><ul>
<li><b>Recommendations:</b> Video Discovery even highlights personalized content recommendations based on content that users watched inside apps.</li>
</ul></ul>
<p>Partners can begin incorporating the Video Discovery API today, starting with resumption and entitlement integrations. Check out <a href="https://support.google.com/googletv/contact/vda?visit_id=638830411759402759-2269823034&p=vda&rd=1" target="_blank">g.co/tv/vda</a> to learn more.
</p><h2><span style="font-size: x-large;">Jetpack Compose for TV</span></h2>
<image><div style="text-align: center;"><img alt="Compose for TV 1.0 expands on the core and Material Compose libraries" border="0" height="163" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhSVh3bbtDZ8DSG8juyjAhgAiyF36WLTrxrQpD78VMcgnr4yOsndJfMdVR55dfy_VcR7y8e_ydTqCYb0ylqCTNDowszojlfKYXLOuM-md8UI4NAGKIEn4Kpt3iav_rcwHVT9WBZeNqqVCU0YE78muCSB5C6YM55k7dOYrcopYy7DCGwCGLor5K7q4XhrTY/w200-h163/compose%20logo%20(for%20blog%20post).png" width="200" /></div></image><br />
<p>Last year, we launched Compose for TV 1.0 beta, which lets you build beautiful, adaptive UIs across Android form factors, including Android TV OS.</p>
<p>Now, <a href="https://developer.android.com/jetpack/androidx/releases/tv" target="_blank">Compose for TV</a> 1.0 is stable and expands on the core and Material Compose libraries. In our internal benchmarking of a mobile sample app, the latest release of Compose improved app startup by roughly 20% compared with the March 2024 release. Because Compose for TV builds upon these libraries, apps built with Compose for TV should also see better app startup times.</p>
<p>New to building with Compose, and not sure where to start? Our updated <a href="https://github.com/android/compose-samples/tree/main/Jetcaster" target="_blank">Jetcaster</a> audio streaming app sample demonstrates how to use Compose across form factors. It includes a dedicated module for playing podcasts on TV by combining separate view models with shared business logic.</p>
<h2><span style="font-size: x-large;">Focus Management Codelab</span></h2>
<p>We understand that focus management can be challenging at times. That’s why we’ve published a <a href="https://developer.android.com/codelabs/large-screens/keyboard-focus-management-in-compose#0" target="_blank">codelab</a> that reviews how to set initial focus, prepare for unexpected focus traversal, and efficiently restore focus.</p>
<h2><span style="font-size: x-large;">Memory Optimization Guide</span></h2>
<p>We’ve released a comprehensive guide on <a href="https://developer.android.com/training/tv/playback/memory" target="_blank">memory optimization</a>, including memory targets for low RAM devices as well. Combined with Android Studio's powerful memory profiler, this helps you understand when your app exceeds those limits and why.</p>
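<p>To make the idea of a memory target concrete, here is an illustrative sketch (our own, not from the guide): a simple heap-budget check. On a real TV device you would read memory information through <span style="color: #0d904f; font-family: courier;">ActivityManager</span> and verify behavior with the memory profiler; the plain JVM <span style="color: #0d904f; font-family: courier;">Runtime</span> API is used here so the logic runs anywhere.</p>

```kotlin
// Illustrative sketch: compare current heap usage against a budget that an
// app might tighten on low-RAM devices. Runtime is a JVM API; on Android,
// ActivityManager.getMemoryInfo() is the usual source of device memory info.
fun heapUsageBytes(): Long {
    val rt = Runtime.getRuntime()
    return rt.totalMemory() - rt.freeMemory()
}

// Hypothetical budget check; budgetBytes would come from your memory target.
fun isOverBudget(budgetBytes: Long): Boolean = heapUsageBytes() > budgetBytes
```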
<h2><span style="font-size: x-large;">In-App Ratings and Reviews</span></h2>
<image><div style="text-align: center;"><img alt="Ratings and reviews entry point for JetStream sample app on TV" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiSiCzDyh4d5xTdHAd7qvboZAdTu27jR7vyKNXHDD7RUZRzLOZ6xEDj9t1s_VbMguC8sAS4RXSI188wYxfPhYjGUnY75EfbGdadUIUBDK_s0R7F6VTm7XVyYkrYsXEskVCTHq5S6WB4Cu0T70fctp_bZCiy5-VMDe6UFt4XP5X4PdXnZU5r_OWbJ_KdQfw/s16000/JetStream%20App%20Rating%20Menu%20(1).png" /></div></image><br />
<p>App ratings and reviews are essential for developers, offering quantitative and qualitative feedback on user experiences. Now, we're extending the In-App Ratings and Reviews API to TV to allow developers to prompt users for ratings and reviews directly from Google TV. Check out our recent blog post detailing <a href="https://android-developers.googleblog.com/2025/05/in-app-ratings-and-reviews-for-tv.html" target="_blank">how to easily integrate the In-App Ratings and Reviews API</a>.</p>
<h2><span style="font-size: x-large;">Android 16 for TV</span></h2>
<image><div style="text-align: center;"><img alt="Android 16 for TV" border="0" height="219" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj9kuQGEvPuBLstoYlNuxPkcgPubPIjfOlUP9EEhss9y9qJq1soKvpXch2waIioJeC2rtQs64YjHSSRgYATy2U8k2BpfsEa0MpbyuV7P51jlAwpsTdnvWYkvCwjhHH8nD1VPRDL6CKUr_Xc9E23gwKQ1AtC8qO6ATDdulFsHQI9nhJTbfx1vvFRZ4pNn4g/w400-h219/A16-mascot.gif" width="75%" /></div></image><br />
<p>We're excited to announce the upcoming release of Android 16 for TV. Developers can begin using the latest <a href="https://developer.android.com/tv/release/16" target="_blank">Emulator</a> today. With Android 16, TV developers can access several great features:</p>
<ul><ul>
<li>The <a href="https://developer.android.com/training/tv/playback/adjust-display-settings" target="_blank">MediaQualityManager</a> allows developers to take control over selecting picture profiles.</li></ul><ul>
<li>Platform support for the <a href="https://opensource.googleblog.com/2025/01/introducing-eclipsa-audio-immersive-audio-for-everyone.html" target="_blank">Eclipsa Audio codec</a> enables creators to use the IAMF spatial audio format. For ExoPlayer support that includes previous platform versions, see ExoPlayer's <a href="https://github.com/androidx/media/tree/release/libraries/decoder_iamf" target="_blank">IAMF decoder module</a>.</li></ul><ul>
<li>There are various improvements to media playback speed, consistency and efficiency, as well as HDMI-CEC reliability and performance optimizations for 64-bit kernels.</li></ul><ul>
<li>Additional APIs and user experiences from Android 16 are also available. We invite you to explore the complete list from the <a href="https://developer.android.com/tv/release/16" target="_blank">Android 16 for TV release notes</a>.</li>
</ul></ul>
<h2><span style="font-size: x-large;">What's next</span></h2>
<p>We're incredibly excited to see how these announcements will optimize your development journey, and look forward to seeing the fantastic apps you'll launch on the platform!</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p>Android Developershttp://www.blogger.com/profile/08588467489110681140[email protected]0tag:blogger.com,1999:blog-6755709643044947179.post-28867126844378929532025-05-20T10:51:00.000-07:002025-05-21T13:34:52.587-07:00Announcing Jetpack Navigation 3<meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEizVuum2Jg1s5Y5EQmfuXrjm8QqcqxdAUDveRe-CSS4ZGVeNG9VFEizeayzpolJ5oCPSfoNmkT3RdS3Z-g1-aXzDBWwJtjOzflCgt657KFitWQby_GcYa5PO4PBN_7IUmG4CC9BvOw8mFFIMPai_R9EoPIcWZkPDV0aAAs20amwT6Lr2oXi5Yfe-e7b1t0/s1600/announcing-jetpack-navigation-3-compose-google-io-2025.png" name="twitter:image"></meta>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEizVuum2Jg1s5Y5EQmfuXrjm8QqcqxdAUDveRe-CSS4ZGVeNG9VFEizeayzpolJ5oCPSfoNmkT3RdS3Z-g1-aXzDBWwJtjOzflCgt657KFitWQby_GcYa5PO4PBN_7IUmG4CC9BvOw8mFFIMPai_R9EoPIcWZkPDV0aAAs20amwT6Lr2oXi5Yfe-e7b1t0/s1600/announcing-jetpack-navigation-3-compose-google-io-2025.png" style="display: none;" />
<em>Posted by Don Turner - Developer Relations Engineer</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiAaXdjIziCSWC6AxDELcBes5wYDD03GByPFm8jipBg5cilU9-3w_50B5RhF1BXzHUY0HTmp-WEAeshuqLmBpdK9onZAys4S3Pv2izciuQxqlTL23YaOCpaGMoazKbXUCHr5TZcDadxWbDdUDAX_1GS_iN133wFnxH-Qal7icpfcZhKZ2aNy3FaJ0zIDds/s1600/new-in-jetpack-compose.png"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiAaXdjIziCSWC6AxDELcBes5wYDD03GByPFm8jipBg5cilU9-3w_50B5RhF1BXzHUY0HTmp-WEAeshuqLmBpdK9onZAys4S3Pv2izciuQxqlTL23YaOCpaGMoazKbXUCHr5TZcDadxWbDdUDAX_1GS_iN133wFnxH-Qal7icpfcZhKZ2aNy3FaJ0zIDds/s1600/new-in-jetpack-compose.png" /></a>
<p>Navigating between screens in your app should be simple, shouldn't it? However, building a robust, scalable, and delightful navigation experience can be a challenge. For years, the Jetpack Navigation library has been a key tool for developers, but as the Android UI landscape has evolved, particularly with the rise of Jetpack Compose, we recognized the need for a new approach.</p>
<p>Today, we're excited to introduce <b>Jetpack Navigation 3</b>, a new navigation library built from the ground up specifically for Compose. For brevity, we'll just call it Nav3 from now on. This library embraces the <a href="https://developer.android.com/develop/ui/compose/mental-model" target="_blank">declarative programming model</a> and <a href="https://developer.android.com/develop/ui/compose/state" target="_blank">Compose state</a> as fundamental building blocks.</p>
<h2><span style="font-size: x-large;">Why a new navigation library?</span></h2>
<p>The original Jetpack Navigation library (sometimes referred to as Nav2 as it's on major version 2) was initially announced back in 2018, before AndroidX and before Compose. While it served its original goals well, we heard from you that it had several limitations when working with modern Compose patterns.</p>
<p>One key limitation was that the back stack state could only be observed indirectly. This meant there could be two sources of truth, potentially leading to an inconsistent application state. Also, Nav2's <span style="color: #0d904f; font-family: courier;">NavHost</span> was designed to display only a single destination – the topmost one on the back stack – filling the available space. This made it difficult to implement adaptive layouts that display multiple panes of content simultaneously, such as a list-detail layout on large screens.</p>
<image><div style="text-align: center;"><img alt="illustration of single pane and two-pane layouts showing list and detail features" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh-1rxgQm0Xe0H1TsZIg7R9XC_jKoDZp8XuC0WBCy7iFTZpxrXEmCaFPBhLJUlsjhyDJVYXD5gqRmyoJsJEufJtIEr5JqGDBdQ05-1ytf75MtNAuwX3x4lBzAEUByOgOzeXfqgx6wGROHdRclzdvCKUJQVxO7BO6MFNNk1_uizsv44UoTjlUTecHDptLbs/s1600/single-pane-to-multi-pane-layout-jetpack-navigation-3-compose-google-io.png" width="100%" /></div><imgcaption><center><em><b>Figure 1.</b> Changing from single pane to multi-pane layouts can create navigational challenges</em></center></imgcaption></image><br />
<h2><span style="font-size: x-large;">Founding principles</span></h2>
<p>Nav3 is built upon principles designed to provide greater flexibility and developer control:</p>
<ul><ul>
<li><b>You own the back stack:</b> You, the developer, not the library, own and control the back stack. It's a simple list which is backed by Compose state. Specifically, Nav3 expects your back stack to be <span style="color: #0d904f; font-family: courier;">SnapshotStateList<T></span> where <span style="color: #0d904f; font-family: courier;">T</span> can be any type you choose. You can navigate by adding or removing items (<span style="color: #0d904f; font-family: courier;">T</span>s), and state changes are observed and reflected by Nav3's UI.</li></ul><ul>
<li><b>Get out of your way:</b> We heard that you don't like a navigation library to be a black box with inaccessible internal components and state. Nav3 is designed to be open and extensible, providing you with building blocks and helpful defaults. If you want custom navigation behavior you can <a href="https://developer.android.com/develop/ui/compose/layering" target="_blank">drop down to lower layers</a> and create your own components and customizations.</li></ul><ul>
<li><b>Pick your building blocks:</b> Instead of embedding all behavior within the library, Nav3 offers smaller components that you can combine to create more complex functionality. We've also provided a "<a href="http://github.com/android/nav3-recipes" target="_blank">recipes book</a>" that shows how to combine components to solve common navigation challenges.</li>
</ul></ul><br />
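<p>Because the back stack is just a list you own, common navigation operations reduce to list operations. The sketch below is a plain-Kotlin illustration: a real Compose app would use <span style="color: #0d904f; font-family: courier;">remember { mutableStateListOf(...) }</span> so Nav3 can observe changes, and the <span style="color: #0d904f; font-family: courier;">popUpTo</span> helper is our own, not a Nav3 API.</p>

```kotlin
// Routes can be any type you choose.
data object Home
data class Product(val id: String)

// Plain MutableList stand-in for the SnapshotStateList a Compose app would own.
val backStack: MutableList<Any> = mutableListOf(Home)

// Navigate forward by adding a route to the back stack.
fun navigateTo(route: Any) {
    backStack.add(route)
}

// Go back by removing the topmost route.
fun goBack() {
    backStack.removeLastOrNull()
}

// Hypothetical helper: pop destinations until `route` is topmost.
fun popUpTo(route: Any) {
    while (backStack.isNotEmpty() && backStack.last() != route) {
        backStack.removeAt(backStack.lastIndex)
    }
}
```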
<image><div style="text-align: center;"><img alt="illustration of the Nav3 display observing changes to the developer-owned back stack" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhA8RGixTxeZgkz1oQftv442euofSEXYiJ0tgRc-WcmQCXCLp0NhoTazDYbPXpjZkXqUlMXFrI5s2yoU0Cv8ZhHZpnY-rYfbKlW_TWzdCmk3pcdzTHjdFb-8FLTpK4K3Z8VEc_jqqZQ1oAbouUh-Zyk5oAfR20USvHs8bdZha_tmX6d5Kuo7Ydy8yNKyLg/s1600/nav3-display-observes-changes-developer-owned-back-stack.png" width="100%" /></div><imgcaption><center><em><b>Figure 2.</b> The Nav3 display observes changes to the developer-owned back stack.</em></center></imgcaption></image><br />
<h2><span style="font-size: x-large;">Key features</span></h2>
<ul><ul>
<li><b>Animations:</b> Built-in transition animations are provided for changes in destination, including for <a href="https://developer.android.com/develop/ui/compose/system/predictive-back" target="_blank">predictive back</a>. It also has a <a href="https://developer.android.com/guide/navigation/navigation-3/animate-destinations" target="_blank">flexible API for custom animation behavior,</a> allowing animations to be overridden at both the app and the individual screen level.</li></ul><ul>
<li><b>Adaptive layouts:</b> A <a href="https://developer.android.com/guide/navigation/navigation-3/custom-layouts" target="_blank">flexible layout API</a> (named <span style="color: #0d904f; font-family: courier;">Scenes</span>) allows you to render multiple destinations in the same layout (for example, a list-detail layout on large screen devices). This makes it easy to switch between single and multi-pane layouts.</li></ul><ul>
<li><b>State scoping:</b> Enables state to be scoped to destinations on the back stack, including optional <span style="color: #0d904f; font-family: courier;">ViewModel</span> support via a <a href="https://developer.android.com/guide/navigation/navigation-3/save-state#scoping-viewmodels" target="_blank">dedicated Jetpack lifecycle library</a>.</li></ul><ul>
<li><b>Modularity:</b> The API design allows navigation code to be split across multiple modules. This improves build times and allows clear separation of responsibilities between feature modules.</li></ul></ul><br />
<image><div style="text-align: center;"><img alt="moving image demonstrating custom animations and predictive back features on a mobile device" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhw5KERXwQVYgw5cqqM86exFaib_HqfUs3SyCLCbZvSeNY5cV_ax67Wg_w-76fcG4MGBADm24rIfehu871ngLhMa_-EeHFYikcp9Ag2Q1JCNhmvEaz7YHsBz72B6pOZCHtAink0Qnqs7jK9dPp7zXueKECKQJ8SvJIm0tO9LD0OZYddsRI_mx7qieURDn8/s1600/jetpack-navigation-3-custom-animation-predictive-back-google-io.gif" width="40%" /></div><imgcaption><center><em><b>Figure 3.</b> Custom animations and predictive back are easy to implement, and easy to override for individual destinations. </em></center></imgcaption></image><br />
<h2><span style="font-size: x-large;">Basic code example</span></h2>
<p>To give you an idea of how Nav3 works, here's a short code sample.</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: #408080; font-style: italic;">// Define the routes in your app and any arguments.</span>
data object Home
data <span style="color: green; font-weight: bold;">class</span> <span style="color: blue;">Product</span>(<span style="color: green; font-weight: bold;">val</span> id: String)
<span style="color: #408080; font-style: italic;">// Create a back stack, specifying the route the app should start with.</span>
<span style="color: green; font-weight: bold;">val</span> backStack = remember { mutableStateListOf<Any>(Home) }
<span style="color: #408080; font-style: italic;">// A NavDisplay displays your back stack. Whenever the back stack changes, the display updates.</span>
NavDisplay(
    backStack = backStack,
    <span style="color: #408080; font-style: italic;">// Specify what should happen when the user goes back</span>
    onBack = { backStack.removeLastOrNull() },
    <span style="color: #408080; font-style: italic;">// An entry provider converts a route into a NavEntry which contains the content for that route.</span>
    entryProvider = { route ->
        <span style="color: green; font-weight: bold;">when</span> (route) {
            <span style="color: green; font-weight: bold;">is</span> Home -> NavEntry(route) {
                Column {
                    Text(<span style="color: #ba2121;">"Welcome to Nav3"</span>)
                    Button(onClick = {
                        <span style="color: #408080; font-style: italic;">// To navigate to a new route, just add that route to the back stack</span>
                        backStack.add(Product(<span style="color: #ba2121;">"123"</span>))
                    }) {
                        Text(<span style="color: #ba2121;">"Click to navigate"</span>)
                    }
                }
            }
            <span style="color: green; font-weight: bold;">is</span> Product -> NavEntry(route) {
                Text(<span style="color: #ba2121;">"Product ${route.id}"</span>)
            }
            <span style="color: green; font-weight: bold;">else</span> -> NavEntry(Unit) { Text(<span style="color: #ba2121;">"Unknown route: $route"</span>) }
        }
    }
)
</pre></div>
<h2><span style="font-size: x-large;">Get started and provide feedback</span></h2>
<p>To get started, check out <a href="https://goo.gle/nav3" target="_blank">the developer documentation</a>, plus the <a href="https://github.com/android/nav3-recipes" target="_blank">recipes repository</a> which provides examples for:</p>
<ul><ul>
<li>common navigation UI, such as a navigation rail or bar</li>
<li>conditional navigation, such as a login flow</li>
<li>custom layouts using <span style="color: #0d904f; font-family: courier;">Scenes</span></li>
</ul></ul>
<p>We plan to provide code recipes, documentation and blogs for more complex use cases in the future.</p>
<p>Nav3 is currently in alpha, which means that the API is liable to change based on feedback. If you have any issues, or would like to provide feedback, please <a href="https://issuetracker.google.com/issues/new?component=1750212&template=2102223" target="_blank">file an issue</a>.</p>
<p>Nav3 offers a flexible and powerful foundation for building modern navigation in your Compose applications. We're really excited to see what you build with it.</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p>Android Developershttp://www.blogger.com/profile/08588467489110681140[email protected]0tag:blogger.com,1999:blog-6755709643044947179.post-32169342359719205662025-05-20T10:50:00.000-07:002025-05-20T11:59:31.166-07:00Android’s Kotlin Multiplatform announcements at Google I/O and KotlinConf 25<meta name="twitter:image" content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiE4JdIUT50f7W50YL6eGC3Mc_8PLWqQICySsTb3y6TthBWj56b-jJvSuJ-dcRGgSnlzwHIzZcF_dHvmIj78B19ZsBsqFDpyfaM-a6K1vLdfuYiixmeeRnkh9YtqaqqT6EBF5CUHbOOquXPNPvai_0otY2POILh98k61B-s3KqkaWTFWlk5yyykCC5rXUc/s1600/Op1_AndroidKoitlin_Multiplatform_SharedModule_Blogger.png">
<img style="display:none" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiE4JdIUT50f7W50YL6eGC3Mc_8PLWqQICySsTb3y6TthBWj56b-jJvSuJ-dcRGgSnlzwHIzZcF_dHvmIj78B19ZsBsqFDpyfaM-a6K1vLdfuYiixmeeRnkh9YtqaqqT6EBF5CUHbOOquXPNPvai_0otY2POILh98k61B-s3KqkaWTFWlk5yyykCC5rXUc/s1600/Op1_AndroidKoitlin_Multiplatform_SharedModule_Blogger.png">
<em>Posted by Ben Trengrove - Developer Relations Engineer, Matt Dyor - Product Manager</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgy8iK7IqYhEQuLt9A9CZ3Jk30P9m5kWXeDmcA4ZHwYOuBctn4rhsLeA9tX7C693r-RT0LXZ7XLjRY6y_OtXWZ1LlyXBpeR2aXWG6ghkDB5BrBVZb0Y_G5HasZg_yGuR-ImMjemCQN4ns3IJFNX75PsoKVgTYj7TrtyTzLIrQgkr-RVXWL3MkwJ7agA1mo/s1600/Op1_AndroidKoitlin_Multiplatform_SharedModule_Hero_Blog.png" imageanchor="1" ><img style="100%" border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgy8iK7IqYhEQuLt9A9CZ3Jk30P9m5kWXeDmcA4ZHwYOuBctn4rhsLeA9tX7C693r-RT0LXZ7XLjRY6y_OtXWZ1LlyXBpeR2aXWG6ghkDB5BrBVZb0Y_G5HasZg_yGuR-ImMjemCQN4ns3IJFNX75PsoKVgTYj7TrtyTzLIrQgkr-RVXWL3MkwJ7agA1mo/s1600/Op1_AndroidKoitlin_Multiplatform_SharedModule_Hero_Blog.png" data-original-width="100%" data-original-height="800" /></a>
<p>Google I/O and KotlinConf 2025 bring a series of announcements on Android’s Kotlin and Kotlin Multiplatform efforts. Here’s what to watch out for:</p>
<h2><span style="font-size: x-large ;">Announcements from Google I/O 2025</span></h2>
<h3><span style="font-size: large ;">Jetpack libraries</span></h3>
<p>Our focus for Jetpack libraries and KMP is on sharing business logic across Android and iOS, but we have begun experimenting with web/WASM support.</p>
<p>We are adding KMP support to Jetpack libraries. Last year we started with <a href="https://developer.android.com/kotlin/multiplatform/room" target="_blank">Room</a>, <a href="https://developer.android.com/kotlin/multiplatform/datastore" target="_blank">DataStore</a>, and Collection, which are now available in stable releases, and we have recently added ViewModel, SavedState, and Paging. The levels of support that our Jetpack libraries guarantee for each platform are categorized into <a href="https://developer.android.com/kotlin/multiplatform#supported-platforms" target="_blank">three tiers</a>, with the top tier covering Android, iOS, and JVM.</p>
<h3><span style="font-size: large ;">Tool improvements</span></h3>
<p>We're developing new tools to help easily start using KMP in your app. With the KMP new <a href="https://android-developers.googleblog.com/2025/05/kotlin-multiplatform-shared-module-templates.html" target="_blank">module template</a> in Android Studio Meerkat, you can <a href="https://developer.android.com/kotlin/multiplatform/migrate" target="_blank">add a new module to an existing app</a> and share code to iOS and other supported KMP platforms.</p>
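<p>For context, a shared module's Gradle configuration typically looks roughly like this (a hedged sketch rather than the template's exact output; the plugin aliases assume a Gradle version catalog, and the namespace is illustrative):</p>

```kotlin
// build.gradle.kts of a hypothetical shared KMP module
plugins {
    alias(libs.plugins.kotlinMultiplatform)
    alias(libs.plugins.androidLibrary)
}

kotlin {
    // Targets: Android plus both iOS device and simulator architectures
    androidTarget()
    iosArm64()
    iosSimulatorArm64()

    sourceSets {
        commonMain.dependencies {
            // shared business logic dependencies go here
        }
    }
}

android {
    namespace = "com.example.shared" // illustrative
    compileSdk = 35
}
```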
<p>In addition to KMP enhancements, Android Studio now supports Kotlin K2 mode for Android-specific features that require language support, such as Live Edit, Compose Preview, and more.</p>
<h3><span style="font-size: large ;">How Google is using KMP</span></h3>
<p>Last year, Google Workspace began experimenting with KMP, and it is now running in production in the Google Docs app on iOS. The app’s runtime performance is on par with or better than before<sup>1</sup>.</p>
<p>It’s been helpful to have an app at this scale test KMP out, because we’re able to identify and fix issues in ways that benefit the whole KMP developer community.</p>
<p>For example, we've upgraded the Kotlin Native compiler to LLVM 16 and contributed a more efficient garbage collector and string implementation. We're also bringing the static analysis power of Android Lint to Kotlin targets and ensuring a unified Gradle DSL for both AGP and KGP to improve the plugin management experience.</p>
<h3><span style="font-size: large ;">New guidance</span></h3>
<p>We're providing comprehensive guidance in the form of two new codelabs: <a href="https://developer.android.com/codelabs/kmp-get-started" target="_blank">Getting started with Kotlin Multiplatform</a> and <a href="https://developer.android.com/codelabs/kmp-migrate-room" target="_blank">Migrating your Room database to KMP</a>, to help you get from standalone Android and iOS apps to shared business logic.</p>
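<p>To give a feel for what shared business logic means in practice, here is a tiny, self-contained example of the kind of platform-independent Kotlin that can move into a shared module's <span style="color: #0d904f; font-family: courier;">commonMain</span> and be called from both Android and iOS (the names are ours, not from the codelabs):</p>

```kotlin
// Platform-independent business logic suitable for a shared KMP module.
data class CartItem(val name: String, val priceCents: Long, val quantity: Int)

class CartCalculator {
    // Total price of the cart in cents.
    fun total(items: List<CartItem>): Long =
        items.sumOf { it.priceCents * it.quantity }
}
```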
<h3><span style="font-size: large ;">Kotlin Improvements</span></h3>
<p>Kotlin Symbol Processing (<a href="https://github.com/google/ksp?tab=readme-ov-file#ksp2-is-here" target="_blank">KSP2</a>) is now stable, with better support for new Kotlin language features and better performance. It is easier to integrate with build systems, is thread-safe, and has better support for debugging annotation processors. In contrast to KSP1, KSP2 has much better compatibility across Kotlin versions. The rewritten command line interface is also significantly easier to use, as it is now a standalone program instead of a compiler plugin.</p>
<h2><span style="font-size: x-large ;">KotlinConf 2025</span></h2>
<p>Google team members are presenting a number of talks at <a href="https://kotlinconf.com/" target="_blank">KotlinConf</a> spanning multiple topics:</p>
<h3><span style="font-size: large ;">Talks</span></h3>
<ul><ul>
<li><b>Deploying KMP at Google Workspace</b> by Jason Parachoniak, Troels Lund, and Johan Bay from the Workspace team discusses the challenges and solutions, including bugs and performance optimizations, encountered when launching Kotlin Multiplatform at Google Workspace, offering comparisons to Objective-C and a Q&A. (Technical Session)</li></ul><br/><ul>
<li><b>The Life and Death of a Kotlin/Native Object</b> by Troels Lund offers a high-level explanation of the Kotlin/Native runtime's inner workings concerning object instantiation, memory management, and disposal. (Technical Session)</li></ul><br/><ul>
<li><b>APIs: How Hard Can They Be?</b> presented by Aurimas Liutikas and Alan Viverette from the Jetpack team delves into the lifecycle of API design, review processes, and evolution within AndroidX libraries, particularly considering KMP and related tools. (Technical Session)</li></ul><br/><ul>
<li><b>Project Sparkles: How Compose for Desktop is changing Android Studio and IntelliJ</b> with Chris Sinco and Sebastiano Poggi from the Android Studio team introduces the initiative ('Project Sparkles') aiming to modernize Android Studio and IntelliJ UIs using Compose for Desktop, covering goals, examples, and collaborations. (Technical Session)</li></ul><br/><ul>
<li><b>JSpecify: Java Nullness Annotations and Kotlin</b> presented by David Baker explains the significance and workings of JSpecify's standard Java nullness annotations for enhancing Kotlin's interoperability with Java libraries. (Lightning Session)</li></ul><br/><ul>
<li><b>Lessons learned decoupling Architecture Components from platform-specific code</b> features Jeremy Woods and Marcello Galhardo from the Jetpack team sharing insights from the Android team on decoupling core components like SavedState and System Back from platform specifics to create common APIs. (Technical Session)</li></ul><br/><ul>
<li><b>KotlinConf’s Closing Panel</b>, a regular staple of the conference, returns, featuring Jeffrey van Gogh as Google’s representative on the panel. (Panel)</li>
</ul></ul>
<h3><span style="font-size: large ;">Live Workshops</span></h3>
<p>If you are attending KotlinConf in person, you can join guided live workshops based on our new codelabs above.</p>
<ul><ul>
<li>The codelab <a href="https://developer.android.com/kmp-get-started" target="_blank">Get Started With Kotlin Multiplatform for Shared Business Logic</a>, offered by Matt Dyor, Dustin Lam, and Tomáš Mlynarič, provides hands-on guidance for extracting business logic from native Android and iOS apps into a shared KMP module.</li></ul><br/><ul>
<li>The codelab <a href="https://developer.android.com/kmp-migrate-room" target="_blank">Migrating Room to Room KMP</a>, also led by Matt Dyor, Dustin Lam, and Tomáš Mlynarič, demonstrates the process of migrating an existing Room database implementation to Room KMP within a shared module.</li>
</ul></ul>
<p>We love engaging with the Kotlin community. If you are attending KotlinConf, we hope you get a chance to check out our booth, with opportunities to chat with our engineers, get your questions answered, and learn more about how you can leverage Kotlin and KMP.</p>
<h2><span style="font-size: x-large ;">Learn more about Kotlin Multiplatform</span></h2>
<p>To learn more about KMP and start sharing your business logic across platforms, check out our <a href="https://developer.android.com/kotlin/multiplatform" target="_blank">documentation</a> and the <a href="https://github.com/android/kotlin-multiplatform-samples/tree/main" target="_blank">sample</a>.</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p><br/>
<small><sup>1</sup> <i>Google Internal Data, March 2025</i></small>
Android Developershttp://www.blogger.com/profile/08588467489110681140[email protected]0tag:blogger.com,1999:blog-6755709643044947179.post-82201882355990551962025-05-20T10:49:00.000-07:002025-05-20T11:59:12.208-07:00Androidify: Building powerful AI-driven experiences with Jetpack Compose, Gemini and CameraX<meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEifotJSMi9EWKPDLuL5blfsijASN368RrWM7eCHDXuBRBXoeIZtoN79Yd5NDnyytO9lu8SmDsIIPbTwpruPOH93Wt0hginP6iJAOj-49clRPfocOCh_z5LIWpMwGvt_YuPldFS-FGwtVjgeJiGfwTMfjCw0iIknmjhvb4BIo9da9vbpuWplFQa8M-dIJEg/s1600/androidfy-google-io-2025.gif" name="twitter:image"></meta>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEifotJSMi9EWKPDLuL5blfsijASN368RrWM7eCHDXuBRBXoeIZtoN79Yd5NDnyytO9lu8SmDsIIPbTwpruPOH93Wt0hginP6iJAOj-49clRPfocOCh_z5LIWpMwGvt_YuPldFS-FGwtVjgeJiGfwTMfjCw0iIknmjhvb4BIo9da9vbpuWplFQa8M-dIJEg/s1600/androidfy-google-io-2025.gif" style="display: none;" />
<em>Posted by Rebecca Franks – Developer Relations Engineer</em>
<p>The Android bot is a beloved mascot for Android users and developers, and previous versions of the bot builder have been very popular. This year, we decided to rebuild the bot maker from the ground up using the latest technology, backed by Gemini. Today we are releasing a new <a href="http://github.com/android/androidify" target="_blank">open source app, Androidify</a>, for learning how to build powerful AI-driven experiences on Android using the latest technologies such as Jetpack Compose, Gemini through Firebase, CameraX, and Navigation 3.</p>
<image><div style="text-align: center;"><img alt="a moving image of various droid bots dancing individually" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg0N0KCsBuGdFqR5bDSDb9GiUttLnW0ULVHsjlz_mAyuT4dGcwPxYEWq4-cE4YYP_LCV-XYNYQb62bH1-C9f2QqSNSmpCInDa-xemnOEQXdhD6XP4qeCVKRY3lhWD-2JykSRqGG4oWsaloGjUlTdJy-GrwuX4vSYKU51TxoMCd5ojjQEiNSseAPl1YXWKc/s1600/androidify-droid-dancing-demo.gif" width="50%" /></div></image><br />
<h2><span style="font-size: x-large;">Androidify app demo</span></h2>
<p>Here’s an example of the app running on a device, showing the conversion of a photo into an Android bot that represents my likeness:</p>
<image><div style="text-align: center;"><img alt="moving image showing the conversion of an image of a woman in a pink dress holding na umbrella into a 3D image of a droid bot wearing a pink dress holding an umbrella" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhxOHgFiXuoeTUjM4oXoTr84cmNqihVX2xt7ARBzl_6NOsgYy6VM2lNNpQtjBIH9vh3zP-Ap8cRw-ew2MMvfhaDkX7-1v2t8jiaHiQR6RqtAIGP8aWmN43mlKF7E5_Ynq3IOYM9HIKCWPIk8wi7KFYfvItrDc7IvEbpo_3oTV0p-9EpeQjC2wDUrl__4TQ/s1080/androidify-bot-demo-google-io.gif" width="40%" /></div>
<h2><span style="font-size: x-large;">Under the hood</span></h2>
<p>The app combines a variety of different Google technologies, such as:</p>
<ul><ul>
<li><b><a href="http://developer.android.com/ai/overview-gemini" target="_blank">Gemini API</a></b> - through Firebase AI Logic SDK, for accessing the underlying Imagen and Gemini models.</li></ul><ul>
<li><b><a href="http://d.android.com/compose" target="_blank">Jetpack Compose</a></b> - for building the UI with delightful animations and making the app adapt to different screen sizes.</li></ul><ul>
<li><b>Navigation 3</b> - the latest navigation library for building up Navigation graphs with Compose.</li></ul><ul>
<li><b><a href="https://developer.android.com/reference/kotlin/androidx/camera/compose/package-summary" target="_blank">CameraX Compose</a> and <a href="https://developer.android.com/media/media3/ui/compose" target="_blank">Media3 Compose</a></b> - for building up a custom camera with custom UI controls (rear camera support, zoom support, tap-to-focus) and playing the promotional video.</li>
</ul></ul>
<p>This sample app is currently using a standard Imagen model, but we've been working on a fine-tuned model that's trained specifically on all of the pieces that make the Android bot cute and fun; we'll share that version later this year. In the meantime, don't be surprised if the sample app puts out some interesting looking examples!</p>
<h2><span style="font-size: x-large;">How does the Androidify app work?</span></h2>
<p>The app leverages our best practices for Architecture, Testing, and UI to showcase a real-world, modern AI application on device.</p>
<image><div style="text-align: center;"><img alt="Flow chart describing Androidify app flow" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEicZhSlnTWz7A9-stUVfbB-rbgpv38xFKgCjJSYoggrrfARVC2YFtyqNYW5_J3LMq20zfqeADQh52q6DwxBUBC455yKX3WctM23xfh-QlhvZBGoY6Q76UKr44ReVU5QWWNQ2P-IFW86caPOh5dIrIrNGBzQthLv7EU9cOCDCHMERjIUMyOg_h97t7q1GK8/s16000/androidify-app-flow-architecture.png" /></div><imgcaption><center><em>Androidify app flow chart detailing how the app works with AI</em></center></imgcaption></image><br />
<h2><span style="font-size: x-large;">AI in Androidify with Gemini and ML Kit</span></h2>
<p>The Androidify app uses the Gemini models in a multitude of ways to enrich the app experience, all powered by the <a href="https://firebase.google.com/docs/vertex-ai" target="_blank">Firebase AI Logic SDK</a>. The app uses Gemini 2.5 Flash and Imagen 3 under the hood:</p>
<ul><ul>
<li><b>Image validation:</b> We ensure that the captured image contains sufficient information, such as a clearly focused person, and assess it for safety. This feature uses the multimodal capabilities of the Gemini API, by giving it a prompt and image at the same time:</li></ul></ul><br />
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: green; font-weight: bold;">val</span> response = generativeModel.generateContent(
    content {
        text(prompt)
        image(image)
    },
)
</pre></div><br />
<ul><ul>
<li><b>Text prompt validation:</b> If the user opts for text input instead of image, we use Gemini 2.5 Flash to ensure the text contains a sufficiently descriptive prompt to generate a bot.</li></ul><br /><ul>
<li><b>Image captioning:</b> Once we’re sure the image has enough information, we use Gemini 2.5 Flash to perform image captioning. We ask Gemini to be as descriptive as possible, focusing on the clothing and its colors.</li></ul><br /><ul>
<li><b>“Help me write” feature:</b> Similar to an “I’m feeling lucky” type feature, “Help me write” uses Gemini 2.5 Flash to create a random description of the clothing and hairstyle of a bot.</li></ul><br /><ul>
<li><b>Image generation from the generated prompt:</b> As the final step, Imagen generates the bot image from the final prompt and the selected skin tone of the bot.</li>
</ul></ul>
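<p>The steps above can be sketched as a simple pipeline. This is a minimal pure-Kotlin illustration, not the real Firebase AI Logic SDK surface: the <span style="font-family: courier;">BotModel</span> interface, its method names, and <span style="font-family: courier;">createBot</span> are all hypothetical stand-ins for the Gemini and Imagen calls.</p>

```kotlin
// Hypothetical interface standing in for the Firebase AI Logic calls;
// names and signatures are illustrative, not the real SDK surface.
interface BotModel {
    fun validateDescription(description: String): Boolean
    fun caption(description: String): String
    fun generateImage(prompt: String, skinTone: String): String
}

// Chains the steps described above: validate the input,
// caption it, then generate the final bot image.
fun createBot(model: BotModel, description: String, skinTone: String): String? {
    if (!model.validateDescription(description)) return null
    val prompt = model.caption(description)
    return model.generateImage(prompt, skinTone)
}
```

<p>Structuring the flow this way keeps each model call independently swappable and testable with fakes.</p>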
<p>The app also uses the <a href="https://developers.google.com/ml-kit/vision/pose-detection" target="_blank">ML Kit pose detection</a> to detect a person in the viewfinder and enable the capture button when a person is detected, as well as adding fun indicators around the content to indicate detection.</p>
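<p>ML Kit pose detection reports a per-landmark in-frame likelihood, so gating the capture button can reduce to a small predicate. The sketch below is an assumption about how such a gate could look, with an illustrative threshold, rather than Androidify's exact logic:</p>

```kotlin
// Gate the capture button on pose confidence. ML Kit reports an
// in-frame likelihood per landmark; here we require the average
// likelihood to clear a threshold (0.8f is an assumed value).
fun shouldEnableCapture(
    landmarkLikelihoods: List<Float>,
    threshold: Float = 0.8f,
): Boolean =
    landmarkLikelihoods.isNotEmpty() &&
        landmarkLikelihoods.average() >= threshold
```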
<p>Explore more detailed information about <a href="https://android-developers.googleblog.com/2025/05/androidify-how-androidify-leverages-gemini-firebase-ml-kit.html" target="_blank">AI usage in Androidify</a>.
</p><h2><span style="font-size: x-large;">Jetpack Compose</span></h2>
<p>The user interface of Androidify is built using Jetpack Compose, the modern UI toolkit that simplifies and accelerates UI development on Android.</p>
<h3><span style="font-size: x-large;">Delightful details with the UI</span></h3>
<p>The app uses <a href="https://m3.material.io/blog/building-with-m3-expressive?utm_source=blog&utm_medium=motion&utm_campaign=IO25" target="_blank">Material 3 Expressive</a>, the latest alpha release that makes your apps more premium, desirable, and engaging. It provides delightful bits of UI out-of-the-box, like new shapes, componentry, and using the <span style="color: #0d904f; font-family: courier;">MotionScheme</span> variables wherever a motion spec is needed.</p>
<p><span style="font-family: courier;"><a href="https://developer.android.com/reference/kotlin/androidx/compose/material3/MaterialShapes" target="_blank">MaterialShapes</a></span> are used in various locations. These are a preset list of shapes that allow for easy morphing between each other—for example, the cute cookie shape for the camera capture button:</p><br />
<image><div style="text-align: center;"><img alt="Androidify app UI showing camera button" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjtTIuUO9OlzwfseVWyceNrYI5DWDcvyMZosXPJr1I5uN8X5vFGjpSQtSCqIGDVZpD8ocxogmtPfKftJb8jSjUt7E4dndHDue4x2Yvq8mwC2JkVuqyeGYGOOUEvSqk7koc7mUDXMTC4FYQYnvsi_ioJu4UClFadiv_HKZ7mLioCLHr3DpvnYrf_kbwq_f4/s1600/camera-button-materialshapes-androidify-google-io.png" width="40%" /></div><imgcaption><center><em>Camera button with a <span style="color: #0d904f; font-family: courier;">MaterialShapes.Cookie9Sided</span> shape</em></center></imgcaption></image>
<p>Beyond using the standard Material components, Androidify also features custom composables and delightful transitions tailored to the specific needs of the app:</p>
<ul><ul>
<li>There are plenty of shared element transitions across the app—for example, a morphing shape shared element transition is performed between the “take a photo” button and the camera surface.</li></ul><br /><ul>
<image><div style="text-align: center;"><img alt="moving example of expressive button shapes in slow motion" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhWVTQUwtQo13XnUS8EByVxH7Ou3rUWqyvY2BpsJh-Ub6ZA4_S3Z3yPZORg9TAUZvHNsaNjhUsAq3bcGZKDStCaeUCzcqxBDu4zwcalJ9ZSFKcyJyqocu6YVvv3yMhNuz-getfnHY5uVD6royUWUm58nBPCwQmA30uMUqopjL_MhRWYn10yeJgsf-yfiV0/s1600/morph-shared-element-androidify-google-io.gif" width="40%" /></div></image><br />
<li>Custom enter transitions for the <span style="font-family: courier;"><a href="https://github.com/android/androidify/blob/169b2d521b0743af765e8d52dd714029d4bf24cc/feature/results/src/main/java/com/android/developers/androidify/results/ResultsScreen.kt#L101" target="_blank">ResultsScreen</a></span> with the usage of marquee modifiers.</li></ul><br /><ul>
<image><div style="text-align: center;"><img alt="animated marquee example" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEivftZEN51EFmOej_1nfp-7zrGWUxbJx7nEx5GgVAtjlK9t4f49VmKE0RSOWP6zhkhMTaj7fnYq5SH5su_5hzxCGoX9wXiyVEROOmPAwIev5dk6O8Uw_cIcPmqERVEA48vD4jcN3pSJvzHhLEchXU5h5locO8qOQFGO-eErP-oXSWU017BaF7Q3Auvjlh4/s1600/end-screen-animation-androidify-google-io.gif" width="40%" /></div></image><br />
<li>Fun color splash animation as a transition between screens.</li></ul><br /><ul>
<image><div style="text-align: center;"><img alt="moving image of a blue color splash transition between Androidify demo screens" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi_5rJvIIo6ioe_Jo5dkrOD7OuZdo4xJHbmnIqr-kVB6pViOimjtcqZUnxZD2trpDTvoySZWcxx4RjqQEPYhlm1vfr8M_NY7Y9rl7ea7XOUneHLGlIxHWQBjIGKaZZ78SddX1UoUYPoG88fNWcr9t09ShrI_KFZcg8nZaUF59enD7sbcL5x5gsLl-jsQiA/s1600/color-splash-androidify-google-io.gif" width="40%" /></div></image><br />
<li>Animating gradient buttons for the AI-powered actions.</li></ul><br /><ul>
<image><div style="text-align: center;"><img alt="animated gradient button for AI powered actions example" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj1RYgxeWS2GZA6Lk9JK0E74VBJBhHeySxP12qXhYwGLl42-o1pdKBCdQY_QUq5z0yW_eHnaRFqT5TiAqyjGJVZ2TqH8WO-QqqJYWhPWjrUJfHveBowy_h5ltkZ9xa54QuAS0e2pANkENbkg0qb5CaaU8Zp5u-ebYgbC-i6HJ_ACzG_sdS6q1aH89cgAZk/s1600/animated-gradient-button-androidify-google-io.gif" width="40%" /></div></image><br />
</ul></ul>
<p>To learn more about the unique details of the UI, read <a href="https://android-developers.googleblog.com/2025/05/androidify-building-delightful-ui-with-compose.html" target="_blank">Androidify: Building delightful UIs with Compose</a>.
</p><h2><span style="font-size: x-large;">Adapting to different devices</span></h2>
<p>Androidify is designed to look great and function seamlessly across candy bar phones, foldables, and tablets. The general goal of developing adaptive apps is to <b>avoid reimplementing the same app multiple times on each form factor</b> by extracting out reusable composables, and leveraging APIs like <span style="color: #0d904f; font-family: courier;">WindowSizeClass</span> to determine what kind of layout to display.</p>
<image><div style="text-align: center;"><img alt="a collage of different adaptive layouts for the Androidify app across small and large screens" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEic1TH6JufLXDvVrsaIgTSbPHlP6OnoAj5ZXxbUIr7bVTgGclDnLVJQcp8yPshtg8rrWQfFRG6W37kYYKwNyBiKzbkX2Z-z2WMuDzSHCmmt9fqvqTDObJWIIrWu0v_GznhdgALR8gcJfLjOMyBRB5g8anHmdwyqJyYWx9sAFmwbpu47HiSfQXZVknl5pow/s16000/adaptive-layouts-androidify-google-io.png" /></div><imgcaption><center><em>Various adaptive layouts in the app</em></center></imgcaption></image><br />
<p>For Androidify, we only needed to leverage the width window size class. Combining this with different layout mechanisms, we were able to reuse or extend the composables to cater to the multitude of different device sizes and capabilities.</p>
<ul><ul>
<li><b>Responsive layouts:</b> The <span style="font-family: courier;"><a href="https://github.com/android/androidify/blob/169b2d521b0743af765e8d52dd714029d4bf24cc/feature/creation/src/main/java/com/android/developers/androidify/creation/CreationScreen.kt#L156" target="_blank">CreationScreen</a></span> demonstrates adaptive design. It uses helper functions like <span style="color: #0d904f; font-family: courier;">isAtLeastMedium()</span> to detect window size categories and adjust its layout accordingly. On larger windows, the image/prompt area and color picker might sit side-by-side in a <span style="color: #0d904f; font-family: courier;">Row</span>, while on smaller windows, the color picker is accessed via a <span style="color: #0d904f; font-family: courier;">ModalBottomSheet</span>. This pattern, called “supporting pane”, highlights the supporting dependencies between the main content and the color picker.</li></ul><br /><ul>
<li><b>Foldable support:</b> The app actively checks for foldable device features. The camera screen uses <span style="color: #0d904f; font-family: courier;">WindowInfoTracker</span> to get <span style="color: #0d904f; font-family: courier;">FoldingFeature</span> information to adapt to different features by optimizing the layout for tabletop posture.</li></ul><br /><ul>
<li><b>Rear display:</b> Support for devices with multiple displays is included via the <span style="font-family: courier;"><a href="https://github.com/android/androidify/blob/169b2d521b0743af765e8d52dd714029d4bf24cc/feature/camera/src/main/java/com/android/developers/androidify/camera/RearCameraUseCase.kt#L42" target="_blank">RearCameraUseCase</a></span>, allowing for the device camera preview to be shown on the external screen when the device is unfolded (so the main content is usually displayed on the internal screen).</li>
</ul></ul>
<p>Using window size classes, coupled with creating a custom <span style="font-family: courier;"><a href="https://github.com/android/androidify/blob/169b2d521b0743af765e8d52dd714029d4bf24cc/core/util/src/main/java/com/android/developers/androidify/util/AdaptivePreview.kt#L59" target="_blank">@LargeScreensPreview</a></span> annotation, helps achieve unique and useful UIs across the spectrum of device sizes and window sizes.</p>
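<p>The width-based branching described above can be sketched in plain Kotlin. The breakpoint values follow the Material window size class guidance (compact below 600dp, medium below 840dp), and <span style="font-family: courier;">isAtLeastMedium</span> is a reimplementation for illustration, not the app's actual helper:</p>

```kotlin
// Width window size classes per the Material guidance:
// compact < 600dp, medium < 840dp, expanded >= 840dp.
enum class WidthSizeClass { COMPACT, MEDIUM, EXPANDED }

fun widthSizeClass(widthDp: Int): WidthSizeClass = when {
    widthDp < 600 -> WidthSizeClass.COMPACT
    widthDp < 840 -> WidthSizeClass.MEDIUM
    else -> WidthSizeClass.EXPANDED
}

// Mirrors the app's isAtLeastMedium() helper: show the side-by-side
// layout on medium and up, fall back to a bottom sheet on compact.
fun isAtLeastMedium(widthDp: Int): Boolean =
    widthSizeClass(widthDp) != WidthSizeClass.COMPACT
```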
<h2><span style="font-size: x-large;">CameraX and Media3 Compose</span></h2>
<p>To allow users to base their bots on photos, Androidify integrates <a href="https://developer.android.com/media/camera/camerax" target="_blank">CameraX</a>, the Jetpack library that makes camera app development easier.</p>
<p>The app uses a custom <span style="font-family: courier;"><a href="https://github.com/android/androidify/blob/169b2d521b0743af765e8d52dd714029d4bf24cc/feature/camera/src/main/java/com/android/developers/androidify/camera/CameraLayout.kt#L52" target="_blank">CameraLayout</a></span> composable that supports the layout of the typical composables that a camera preview screen would include—for example, zoom buttons, a capture button, and a flip camera button. This layout adapts to different device sizes and more advanced use cases, like the tabletop mode and rear-camera display. For the actual rendering of the camera preview, it uses the new <span style="font-family: courier;"><a href="https://developer.android.com/reference/kotlin/androidx/camera/compose/package-summary#CameraXViewfinder%28androidx.camera.core.SurfaceRequest,androidx.compose.ui.Modifier,androidx.camera.viewfinder.core.ImplementationMode,androidx.camera.viewfinder.compose.MutableCoordinateTransformer%29" target="_blank">CameraXViewfinder</a></span> that is part of the <span style="color: #0d904f; font-family: courier;">camerax-compose</span> artifact.</p>
<image><div style="text-align: center;"><img alt="CameraLayout in Compose" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgI8uGN5mVP0rjmzH6UdJm1lDfrQPg52kArxQaz0t41xh-3cM7dlbS1u6um4bPSIEhUiX8NorFs_tOI_V6K-rd8b7cEUJWIbOzbBlYOoPqz_lfIhfEakWDiqdHG1xQnhJLXvyeVgowNWqTCr3okJ4CiJaKSGzAaAw9ThkhRcIPk6trKZuh6BOCEzdfXb-s/s16000/cameralayout-camerax-media3-compose-androidify-google-io.png" /></div><imgcaption><center><em><span style="color: #0d904f; font-family: courier;">CameraLayout</span> composable that takes care of different device configurations, such as table top mode</em></center></imgcaption></image><br />
<image><div style="text-align: center;"><img alt="CameraLayout in Compose" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjvgaQZTySov3PINb0Hce1XNxW6cbhtExTgKwZZbUQ9XYGGuMilMhqEbiVp6W73Bf-AP_uFKbGEYtav1nChf-zTuXCBHzxXkM_jxP4TvGOnTehRq0sJ5zlhqBAPzFbnLNTzr2IHoRAiWLN0Mil9VPC-44IYONEyFJiDece2PCDuFPF6XXPABYlbe2I7KFQ/s16000/adaptive-camera-layouts-androidify-google-io.png" /></div><imgcaption><center><em><span style="color: #0d904f; font-family: courier;">CameraLayout</span> composable that takes care of different device configurations, such as table top mode</em></center></imgcaption></image><br />
<p>The app also integrates with <a href="https://developer.android.com/media/media3" target="_blank">Media3</a> APIs to load an instructional video for showing how to get the best bot from a prompt or image. Using the new <span style="color: #0d904f; font-family: courier;">media3-ui-compose</span> artifact, we can easily add a <span style="font-family: courier;"><a href="https://github.com/android/androidify/blob/169b2d521b0743af765e8d52dd714029d4bf24cc/feature/home/src/main/java/com/android/developers/androidify/home/HomeScreen.kt#L517" target="_blank">VideoPlayer</a></span> into the app:</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%">@Composable
<span style="color: #008000; font-weight: bold">private</span> <span style="color: #008000; font-weight: bold">fun</span> <span style="color: #0000FF">VideoPlayer</span>(modifier: Modifier = Modifier) {
<span style="color: #008000; font-weight: bold">val</span> context = LocalContext.current
<span style="color: #008000; font-weight: bold">var</span> player by remember { mutableStateOf<Player?>(<span style="color: #008000; font-weight: bold">null</span>) }
LifecycleStartEffect(Unit) {
player = ExoPlayer.Builder(context).build().apply {
setMediaItem(MediaItem.fromUri(Constants.PROMO_VIDEO))
repeatMode = Player.REPEAT_MODE_ONE
prepare()
}
onStopOrDispose {
player?.release()
player = <span style="color: #008000; font-weight: bold">null</span>
}
}
Box(
modifier
.background(MaterialTheme.colorScheme.surfaceContainerLowest),
) {
player?.let { currentPlayer ->
PlayerSurface(currentPlayer, surfaceType = SURFACE_TYPE_TEXTURE_VIEW)
}
}
}
</pre></div><br/>
<p>Using the new <span style="font-family: courier;">onLayoutRectChanged</span> modifier, we also listen for whether the composable is completely visible or not, and play or pause the video based on this information:</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%"><span style="color: #008000; font-weight: bold">var</span> videoFullyOnScreen by remember { mutableStateOf(<span style="color: #008000; font-weight: bold">false</span>) }
LaunchedEffect(videoFullyOnScreen) {
<span style="color: #008000; font-weight: bold">if</span> (videoFullyOnScreen) currentPlayer.play() <span style="color: #008000; font-weight: bold">else</span> currentPlayer.pause()
}
<span style="color: #408080; font-style: italic">// We add this onto the player composable to determine if the video composable is visible, and mutate the videoFullyOnScreen variable, that then toggles the player state. </span>
Modifier.onVisibilityChanged(
containerWidth = LocalView.current.width,
containerHeight = LocalView.current.height,
) { fullyVisible -> videoFullyOnScreen = fullyVisible }
<span style="color: #408080; font-style: italic">// A simple version of visibility changed detection</span>
<span style="color: #008000; font-weight: bold">fun</span> Modifier.onVisibilityChanged(
containerWidth: Int,
containerHeight: Int,
onChanged: (visible: Boolean) -> Unit,
) = <span style="color: #008000; font-weight: bold">this</span> then Modifier.onLayoutRectChanged(<span style="color: #666666">100</span>, <span style="color: #666666">0</span>) { layoutBounds ->
onChanged(
layoutBounds.boundsInRoot.top > <span style="color: #666666">0</span> &&
layoutBounds.boundsInRoot.bottom < containerHeight &&
layoutBounds.boundsInRoot.left > <span style="color: #666666">0</span> &&
layoutBounds.boundsInRoot.right < containerWidth,
)
}
</pre></div>
<p>Additionally, using <span style="color: #0d904f; font-family: courier;">rememberPlayPauseButtonState</span>, we add on a layer on top of the player to offer a play/pause button on the video itself:</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%"><span style="color: #008000; font-weight: bold">val</span> playPauseButtonState = rememberPlayPauseButtonState(currentPlayer)
OutlinedIconButton(
onClick = playPauseButtonState::onClick,
enabled = playPauseButtonState.isEnabled,
) {
<span style="color: #008000; font-weight: bold">val</span> icon =
<span style="color: #008000; font-weight: bold">if</span> (playPauseButtonState.showPlay) R.drawable.play <span style="color: #008000; font-weight: bold">else</span> R.drawable.pause
<span style="color: #008000; font-weight: bold">val</span> contentDescription =
<span style="color: #008000; font-weight: bold">if</span> (playPauseButtonState.showPlay) R.string.play <span style="color: #008000; font-weight: bold">else</span> R.string.pause
Icon(
painterResource(icon),
stringResource(contentDescription),
)
}
</pre></div>
<p>Check out the code for more details on <a href="http://github.com/android/androidify" target="_blank">how CameraX and Media3 were used in Androidify</a>.</p>
<h2><span style="font-size: x-large;">Navigation 3</span></h2>
<p>Screen transitions are handled using the new Jetpack Navigation 3 library <span style="color: #0d904f; font-family: courier;">androidx.navigation3</span>. The <span style="font-family: courier;"><a href="https://github.com/android/androidify/blob/169b2d521b0743af765e8d52dd714029d4bf24cc/app/src/main/java/com/android/developers/androidify/navigation/MainNavigation.kt#L63" target="_blank">MainNavigation</a></span> composable defines the different destinations (Home, Camera, Creation, About) and displays the content associated with each destination using <span style="color: #0d904f; font-family: courier;">NavDisplay</span>. You get full control over your back stack, and navigating to and from destinations is as simple as adding and removing items from a list.</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%">@Composable
<span style="color: #008000; font-weight: bold">fun</span> <span style="color: #0000FF">MainNavigation</span>() {
<span style="color: #008000; font-weight: bold">val</span> backStack = rememberMutableStateListOf<NavigationRoute>(Home)
NavDisplay(
backStack = backStack,
onBack = { backStack.removeLastOrNull() },
entryProvider = entryProvider {
entry<Home> { entry ->
HomeScreen(
onAboutClicked = {
backStack.add(About)
},
)
}
entry<Camera> {
CameraPreviewScreen(
onImageCaptured = { uri ->
backStack.add(Create(uri.toString()))
},
)
}
<span style="color: #408080; font-style: italic">// etc</span>
},
)
}
</pre></div>
<p>Notably, Navigation 3 exposes a new composition local, <span style="color: #0d904f; font-family: courier;">LocalNavAnimatedContentScope</span>, to easily integrate your shared element transitions without needing to keep track of the scope yourself. By default, Navigation 3 also integrates with predictive back, providing delightful back experiences when navigating between screens, as seen in this prior shared element transition:</p>
<image><div style="text-align: center;"><img alt="CameraLayout in Compose" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjMN81Kqzol9hbhrH4CTbCUz5rKw1FkQkYxst1uZFtFarE3CsVcYgc_nuLOlhgmJLphHovpOZB2eTw0lEdi71qEWks1hARucQKbnmOlcsLgrsUwnPftJS0N6Uex_ZYECqWumfUdg-GQcTFxttRhFJkzY2pKKQVrwsIdMieNqX0Sz9WgvdOain2WIFoeKII/s1600/shared-element-transition-androidify-app-google-io.gif" width= "40%"/></div></image><br />
<p>Learn more about <a href="http://goo.gle/nav3" target="_blank">Jetpack Navigation 3, currently in alpha</a>.</p>
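<p>Because the Navigation 3 back stack is a list you own, the navigation operations reduce to plain list manipulation. The sketch below models this with illustrative stand-in route types (Androidify's real routes live in its <span style="font-family: courier;">NavigationRoute</span> hierarchy):</p>

```kotlin
// Stand-in route types for illustration; the real app defines
// routes such as Home, Camera, Creation, and About.
sealed interface NavigationRoute
object Home : NavigationRoute
object About : NavigationRoute
data class Create(val imageUri: String) : NavigationRoute

// Navigating forward is appending to the list; going back is
// removing the last entry, exactly as NavDisplay's onBack does.
fun navigate(backStack: MutableList<NavigationRoute>, route: NavigationRoute) {
    backStack.add(route)
}

fun goBack(backStack: MutableList<NavigationRoute>): NavigationRoute? =
    backStack.removeLastOrNull()
```

<p>Owning the back stack as plain state also means you can persist it, inspect it, or rewrite it wholesale without going through a navigation controller API.</p>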
<h2><span style="font-size: x-large;">Learn more</span></h2>
<p>By combining the declarative power of Jetpack Compose, the camera capabilities of CameraX, the intelligent features of Gemini, and thoughtful adaptive design, Androidify is a personalized avatar creation experience that feels right at home on any Android device. You can find the full code sample at <a href="http://github.com/android/androidify" target="_blank">github.com/android/androidify</a> where you can see the app in action and be inspired to build your own AI-powered app experiences.</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p><br />
Android Developershttp://www.blogger.com/profile/08588467489110681140[email protected]0tag:blogger.com,1999:blog-6755709643044947179.post-8356179759328572572025-05-20T10:48:00.001-07:002025-05-20T12:39:46.655-07:00Androidify: Building delightful UIs with Compose<meta name="twitter:image" content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiuRcXkPQvPqpnXVGz8yrQY5qvmlvk6xlX0hFMTEFslcCNu27BKEcDLzvbW1nYOvRZmVd7aQ0mUpCSRRNw5OjwFjxDW-0gkkaLPb_aVav22fix6islazqbC2xexAiM1sxjF_nRgjlR5ffcXFIlW3673h9wBml8b0CdSZPqGezIRG-xxROyh2-BRY6ZPPko/s1600/blog_banner_androidify_dev_2.png">
<img style="display:none" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiuRcXkPQvPqpnXVGz8yrQY5qvmlvk6xlX0hFMTEFslcCNu27BKEcDLzvbW1nYOvRZmVd7aQ0mUpCSRRNw5OjwFjxDW-0gkkaLPb_aVav22fix6islazqbC2xexAiM1sxjF_nRgjlR5ffcXFIlW3673h9wBml8b0CdSZPqGezIRG-xxROyh2-BRY6ZPPko/s1600/blog_banner_androidify_dev_2.png">
<em>Posted by Rebecca Franks - Developer Relations Engineer</em>
<div><br/></div>
<p>Androidify is a new <a href="http://github.com/android/androidify" target="_blank">sample app</a> we built using the latest best practices for mobile apps. Previously, we covered <a href="https://android-developers.googleblog.com/2025/05/androidify-building-ai-driven-experiences-jetpack-compose-gemini-camerax.html" target="_blank">all the different features of the app</a>, from Gemini integration and CameraX functionality to adaptive layouts. In this post, we dive into the Jetpack Compose usage throughout the app, building upon our base knowledge of Compose to add delightful and expressive touches along the way!</p>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEglSlE_fOzV5wgjf9ATKctAilcuw_2ANuKyWA0aWVE4FMPv_YkI-l0eUoiMzBjBFzOKXTd2qKS1U3M4HC25C8lIcY7ahbbGxdmacOSqqv7qpaDnZELUh9WfQleKrO3OAnkk84HlsI0EvhUlSMNICfk1AsZ1XvkKFAraQmoeOSr1F3dgSzGWQoBGNW6D0Iw/s1600/android-developers-androidify-compose.png" imageanchor="1" ><img style="100%" border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEglSlE_fOzV5wgjf9ATKctAilcuw_2ANuKyWA0aWVE4FMPv_YkI-l0eUoiMzBjBFzOKXTd2qKS1U3M4HC25C8lIcY7ahbbGxdmacOSqqv7qpaDnZELUh9WfQleKrO3OAnkk84HlsI0EvhUlSMNICfk1AsZ1XvkKFAraQmoeOSr1F3dgSzGWQoBGNW6D0Iw/s1600/android-developers-androidify-compose.png" data-original-width="100%"/></a>
<h2><span style="font-size: x-large;">Material 3 Expressive</span></h2>
<p>Material 3 Expressive is an expansion of the Material 3 design system. It’s a set of new features, updated components, and design tactics for creating emotionally impactful UX.</p>
<iframe class="BLOG_video_class" allowfullscreen="" youtube-src-id="n17dnMChX14" width="100%" height="498" src="https://www.youtube.com/embed/n17dnMChX14"></iframe><br/>
<p>It’s been released as part of the alpha version of the Material 3 artifact (<span style="color: #0d904f ; font-family: courier ;">androidx.compose.material3:material3:1.4.0-alpha10</span>) and contains a wide range of new components you can use within your apps to build more personalized and delightful experiences. Learn more about <a href="https://m3.material.io/blog/building-with-m3-expressive?utm_source=blog&utm_medium=motion&utm_campaign=IO25" target="_blank">Material 3 Expressive's component and theme updates for more engaging and user-friendly products</a>.</p>
<image><div style="text-align: center;"><img id="imgFull" alt="Material Expressive Component updates" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiC0jrz8_9qrnxHbn2oEvBsfmNoC8LJQL7bUUVoYzDzhE38IVt5IUtSNSy_FNyzr4RPrEcxB80TlgRHZnUgKnklYoG1v9mD51LXkPJsKHAXAZmAUzqS1YTPneiOjS-8PF0hFPOar3NEhli1fcw_zFFNqCxA3OU191FLv1JxCJ43PHC7sGao7fKpVya9xU0/s1600/material-expressive-component-update-androidify-google-io.png" width="100%"/></div><imgcaption><center><em>Material Expressive Component updates</em></center></imgcaption></image><br/>
<p>In addition to the new component updates, <a href="https://m3.material.io/blog/m3-expressive-motion-theming" target="_blank">Material 3 Expressive introduces a new motion physics system</a> that's encompassed in the Material theme.</p>
<p>In Androidify, we’ve utilized Material 3 Expressive in a few different ways across the app. For example, we’ve explicitly opted in to the new <span style="color: #0d904f ; font-family: courier ;">MaterialExpressiveTheme</span> and chosen <span style="color: #0d904f ; font-family: courier ;">MotionScheme.expressive()</span> (this is the default when using the expressive theme) to add a bit of playfulness to the app:</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%">@Composable
<span style="color: #008000; font-weight: bold">fun</span> <span style="color: #0000FF">AndroidifyTheme</span>(
    content: @Composable () -> Unit,
) {
    <span style="color: #008000; font-weight: bold">val</span> colorScheme = LightColorScheme
    MaterialExpressiveTheme(
        colorScheme = colorScheme,
        typography = Typography,
        shapes = shapes,
        motionScheme = MotionScheme.expressive(),
        content = {
            SharedTransitionLayout {
                CompositionLocalProvider(LocalSharedTransitionScope provides <span style="color: #008000; font-weight: bold">this</span>) {
                    content()
                }
            }
        },
    )
}
</pre></div>
<p>Some of the new componentry is used throughout the app, including the <span style="font-family: courier ;"><a href="https://developer.android.com/reference/kotlin/androidx/compose/material3/package-summary?hl=en#HorizontalFloatingToolbar%28kotlin.Boolean,androidx.compose.ui.Modifier,androidx.compose.material3.FloatingToolbarColors,androidx.compose.foundation.layout.PaddingValues,androidx.compose.material3.FloatingToolbarScrollBehavior,androidx.compose.ui.graphics.Shape,kotlin.Function1,kotlin.Function1,androidx.compose.ui.unit.Dp,androidx.compose.ui.unit.Dp,kotlin.Function1%29" target="_blank">HorizontalFloatingToolbar</a></span> for the Prompt type selection:</p>
<image><div style="text-align: center;"><img id="imgFull" alt="moving example of expressive button shapes in slow motion" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiU7_OHCwUDzW5FkzYAL6N-_nURWzxkLfCvWLBVsYCyPe0Hz9R4yLShuB8gIn8wFEw8FFlaPONKRDyJ_MBFoem53QnApdkb50TXeKB7FshcDaKqBFpZdeZJKfAO0j2NA9AzXtniI40AWHk2-_C6Jh7Shnqku9t2h2IDKcbXdZlP7K2wYcPJaYlfECNTIis/s1600/expressive-button-shapes-slow-mo.gif" width="40%"/></div></image><br/>
<p>The app also uses <span style="font-family: courier ;"><a href="https://developer.android.com/reference/kotlin/androidx/compose/material3/MaterialShapes" target="_blank">MaterialShapes</a></span> in various locations. This is a preset collection of shapes that can easily morph into one another. For example, check out the cute cookie shape for the camera capture button:</p>
<image><div style="text-align: center;"><img id="imgFull" alt="Material Expressive Component updates" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiFKuWr-zx8wUpeSuXNWFRNJ754V7JMUUuCcWphB6H7FnXIpYZ-Rpob2miMmjDNxREmm4cA0AMUekJBPTgfqJh4dsu4IFNbWFEEOWrJwz4BtX5sBvVZ5U3apYKSpJLkZSG6qEG7HuFVifYdm991MwgYQEDREFfddoZ_Zs7PjIeY8KMWugC8uK5daEksuDs/s1600/camera-button-materialshapes-androidify-google-io.png" width="40%"/></div><imgcaption><center><em>Camera button with a <span style="color: #0d904f ; font-family: courier ;">MaterialShapes.Cookie9Sided</span> shape</em></center></imgcaption></image><br/>
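Conceptually, shape morphing works by matching up two outlines and interpolating between them as a progress value goes from 0 to 1. Here is a toy sketch of that idea in plain Kotlin (illustrative only, not the actual <span style="font-family: courier ;">androidx.graphics.shapes</span> implementation, which also matches curves and rounds corners):

```kotlin
// Toy sketch: linearly interpolate matched vertex lists of two outlines.
data class Point(val x: Float, val y: Float)

fun lerp(a: Point, b: Point, t: Float) =
    Point(a.x + (b.x - a.x) * t, a.y + (b.y - a.y) * t)

// Produces the in-between outline at progress t (0f = from, 1f = to).
fun morph(from: List<Point>, to: List<Point>, t: Float): List<Point> {
    require(from.size == to.size) { "outlines must be matched up first" }
    return from.zip(to) { a, b -> lerp(a, b, t) }
}

fun main() {
    val square = listOf(Point(0f, 0f), Point(1f, 0f), Point(1f, 1f), Point(0f, 1f))
    val diamond = listOf(Point(0.5f, 0f), Point(1f, 0.5f), Point(0.5f, 1f), Point(0f, 0.5f))
    println(morph(square, diamond, 0.5f)) // outline halfway between the two shapes
}
```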
<h2><span style="font-size: x-large;">Animations</span></h2>
<p>Wherever possible, the app leverages the Material 3 Expressive <span style="color: #0d904f ; font-family: courier ;">MotionScheme</span> to obtain a themed motion token, creating a consistent motion feeling throughout the app. For example, the scale animation on the camera button press is powered by <span style="color: #0d904f ; font-family: courier ;">defaultSpatialSpec()</span>, a specification used for animations that move something across a screen (such as x,y or rotation, scale animations):</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%"><span style="color: #008000; font-weight: bold">val</span> interactionSource = remember { MutableInteractionSource() }
<span style="color: #008000; font-weight: bold">val</span> animationSpec = MaterialTheme.motionScheme.defaultSpatialSpec&lt;Float&gt;()
Spacer(
    modifier
        .indication(interactionSource, ScaleIndicationNodeFactory(animationSpec))
        .clip(MaterialShapes.Cookie9Sided.toShape())
        .size(size)
        .drawWithCache {
            <span style="color: #408080; font-style: italic">//.. etc</span>
        },
)
</pre></div><br/>
<image><div style="text-align: center;"><img id="imgFull" alt="Camera button scale interaction" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhlK4tJb8rYd9AQL28jHsPX4HlFvVUPbKchXxNflZPcFewj3M4nOU8-5UmoR8eJLIWDc7x6DS0zqWzKxCjopy95f7iA01lSRA1FNHzyT3yN8PFU__c2eVkIrtCzsUYqSpG3dn33iwBVRIVbsE_spM_ivS_7VIpnAfBGeklZxQJ9kykgwSQGHaGSfCvEIYg/s1600/camera-button-scale-interaction-androidify-google-io.gif" width="75%"/></div><imgcaption><center><em>Camera button scale interaction</em></center></imgcaption></image><br/>
<h2><span style="font-size: x-large;">Shared element animations</span></h2>
<p>The app uses shared element transitions between different screen states. Last year, we showcased how you can <a href="https://developer.android.com/develop/ui/compose/animation/shared-elements" target="_blank">create shared elements in Jetpack Compose</a>, and we’ve extended this in the Androidify sample to create a fun example. It combines the new Material 3 Expressive <span style="color: #0d904f ; font-family: courier ;">MaterialShapes</span>, and performs a transition with a morphing shape animation:</p>
<image><div style="text-align: center;"><img id="imgFull" alt="moving example of expressive button shapes in slow motion" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhWVTQUwtQo13XnUS8EByVxH7Ou3rUWqyvY2BpsJh-Ub6ZA4_S3Z3yPZORg9TAUZvHNsaNjhUsAq3bcGZKDStCaeUCzcqxBDu4zwcalJ9ZSFKcyJyqocu6YVvv3yMhNuz-getfnHY5uVD6royUWUm58nBPCwQmA30uMUqopjL_MhRWYn10yeJgsf-yfiV0/s1600/morph-shared-element-androidify-google-io.gif" width="40%"/></div></image><br/>
<p>To do this, we created a custom <span style="color: #0d904f ; font-family: courier ;">Modifier</span> that takes in the target and resting shapes for the <span style="color: #0d904f ; font-family: courier ;">sharedBounds</span> transition:</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%">@Composable
<span style="color: #008000; font-weight: bold">fun</span> Modifier.sharedBoundsRevealWithShapeMorph(
    sharedContentState: SharedTransitionScope.SharedContentState,
    sharedTransitionScope: SharedTransitionScope = LocalSharedTransitionScope.current,
    animatedVisibilityScope: AnimatedVisibilityScope = LocalNavAnimatedContentScope.current,
    boundsTransform: BoundsTransform = MaterialTheme.motionScheme.sharedElementTransitionSpec,
    resizeMode: SharedTransitionScope.ResizeMode = SharedTransitionScope.ResizeMode.RemeasureToBounds,
    <b>restingShape: RoundedPolygon = RoundedPolygon.rectangle().normalized(),
    targetShape: RoundedPolygon = RoundedPolygon.circle().normalized(),</b>
)
</pre></div><br/>
<p>Then, we apply a custom <span style="color: #0d904f ; font-family: courier ;">OverlayClip</span> to provide the morphing shape, by tying into the <span style="color: #0d904f ; font-family: courier ;">AnimatedVisibilityScope</span> provided by the <span style="color: #0d904f ; font-family: courier ;">LocalNavAnimatedContentScope</span>:</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%"><span style="color: #008000; font-weight: bold">val</span> animatedProgress =
    animatedVisibilityScope.transition.animateFloat(targetValueByState = targetValueByState)
<span style="color: #008000; font-weight: bold">val</span> morph = remember {
    Morph(restingShape, targetShape)
}
<span style="color: #008000; font-weight: bold">val</span> morphClip = MorphOverlayClip(morph, { animatedProgress.value })
<span style="color: #008000; font-weight: bold">return</span> <span style="color: #008000; font-weight: bold">this</span>@sharedBoundsRevealWithShapeMorph
    .sharedBounds(
        sharedContentState = sharedContentState,
        animatedVisibilityScope = animatedVisibilityScope,
        boundsTransform = boundsTransform,
        resizeMode = resizeMode,
        clipInOverlayDuringTransition = morphClip,
        renderInOverlayDuringTransition = renderInOverlayDuringTransition,
    )
</pre></div><br/>
<p>View the <a href="https://github.com/android/androidify/blob/770f690d74bbaccaa5e8cd0fa88c4bdcf244b87c/core/theme/src/main/java/com/android/developers/androidify/theme/SharedElementsConfig.kt#L185" target="_blank">full code snippet for this <span style="font-family: courier ;">Modifier</span> on GitHub</a>.</p>
<h2><span style="font-size: x-large;">Autosize text</span></h2>
<p>With the latest release of <a href="https://android-developers.googleblog.com/2025/04/whats-new-in-jetpack-compose-april-25.html" target="_blank">Jetpack Compose 1.8</a>, we added the ability to create text composables that automatically adjust the font size to fit the container’s available size with the new autoSize parameter:</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%">BasicText(
    text,
    style = MaterialTheme.typography.titleLarge,
    autoSize = TextAutoSize.StepBased(maxFontSize = <span style="color: #666666">220.</span>sp),
)
</pre></div>
<p>This is used front and center for the “Customize your own Android Bot” text:</p>
<image><div style="text-align: center;"><img id="imgFull" alt="Text reads Customize your own Android Bot with an inline moving image" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjOUbdg-XFBpDsB0tNCcWnElfPMk6fDpxtwS0FpiifNqe63qopzYI0jjX19Obv4milyZOlyMsTwBOyrr5ei24DHeJU9mn7GkPGgKSatGIe2Xe2_HLUJdCSXKOedEgQS14QqrCmzx2Haos-aRCgqx3sVZz6CuZiobH2yI5O9S5Y006A7eNihPpwM8uvsWLc/s1600/customize-your-own-android-bot-androidify-google-io.png" width="40%"/></div><imgcaption><center><em><i>“Customize your own Android Bot”</i> text with inline GIF</em></center></imgcaption></image><br/>
<p>This text composable is interesting because it needed to have the fun dancing Android bot in the middle of the text. To do this, we use the <span style="color: #0d904f ; font-family: courier ;">inlineContent</span> parameter with an <span style="color: #0d904f ; font-family: courier ;">InlineTextContent</span>, which allows us to place a composable in the middle of the text composable itself:</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%">@Composable
<span style="color: #008000; font-weight: bold">private</span> <span style="color: #008000; font-weight: bold">fun</span> <span style="color: #0000FF">DancingBotHeadlineText</span>(modifier: Modifier = Modifier) {
    Box(modifier = modifier) {
        <span style="color: #008000; font-weight: bold">val</span> animatedBot = <span style="color: #BA2121">"animatedBot"</span>
        <span style="color: #008000; font-weight: bold">val</span> text = buildAnnotatedString {
            append(stringResource(R.string.customize))
            <span style="color: #408080; font-style: italic">// Attach "animatedBot" annotation on the placeholder</span>
            appendInlineContent(animatedBot)
            append(stringResource(R.string.android_bot))
        }
        <span style="color: #008000; font-weight: bold">var</span> placeHolderSize by remember {
            mutableStateOf(<span style="color: #666666">220.</span>sp)
        }
        <span style="color: #008000; font-weight: bold">val</span> inlineContent = mapOf(
            Pair(
                animatedBot,
                InlineTextContent(
                    Placeholder(
                        width = placeHolderSize,
                        height = placeHolderSize,
                        placeholderVerticalAlign = PlaceholderVerticalAlign.TextCenter,
                    ),
                ) {
                    DancingBot(
                        modifier = Modifier
                            .padding(top = <span style="color: #666666">32.d</span>p)
                            .fillMaxSize(),
                    )
                },
            ),
        )
        BasicText(
            text,
            modifier = Modifier
                .align(Alignment.Center)
                .padding(bottom = <span style="color: #666666">64.d</span>p, start = <span style="color: #666666">16.d</span>p, end = <span style="color: #666666">16.d</span>p),
            style = MaterialTheme.typography.titleLarge,
            autoSize = TextAutoSize.StepBased(maxFontSize = <span style="color: #666666">220.</span>sp),
            maxLines = <span style="color: #666666">6</span>,
            onTextLayout = { result ->
                placeHolderSize = result.layoutInput.style.fontSize * <span style="color: #666666">3.5f</span>
            },
            inlineContent = inlineContent,
        )
    }
}
</pre></div>
<h2><span style="font-size: x-large;">Composable visibility with <span style="color: #0d904f ; font-family: courier ;">onLayoutRectChanged</span></span></h2>
<p>With <a href="https://android-developers.googleblog.com/2025/04/whats-new-in-jetpack-compose-april-25.html" target="_blank">Compose 1.8</a>, a new modifier, <span style="font-family: courier ;"><a href="https://developer.android.com/reference/kotlin/androidx/compose/ui/Modifier?hl=en#%28androidx.compose.ui.Modifier%29.onLayoutRectChanged%28kotlin.Long,kotlin.Long,kotlin.Function1%29" target="_blank">Modifier.onLayoutRectChanged</a></span>, was added. This modifier is a more performant version of <span style="color: #0d904f ; font-family: courier ;">onGloballyPositioned</span>, and includes features such as debouncing and throttling to make it performant inside lazy layouts.</p>
<p>In Androidify, we’ve used this modifier for the color splash animation. We attach it to the “Let’s Go” button to capture the bounds that the transition should start from:</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%"><span style="color: #008000; font-weight: bold">var</span> buttonBounds by remember {
    mutableStateOf&lt;RelativeLayoutBounds?&gt;(<span style="color: #008000; font-weight: bold">null</span>)
}
<span style="color: #008000; font-weight: bold">var</span> showColorSplash by remember {
    mutableStateOf(<span style="color: #008000; font-weight: bold">false</span>)
}
Box(modifier = Modifier.fillMaxSize()) {
    PrimaryButton(
        buttonText = <span style="color: #BA2121">"Let's Go"</span>,
        modifier = Modifier
            .align(Alignment.BottomCenter)
            .onLayoutRectChanged(
                callback = { bounds ->
                    buttonBounds = bounds
                },
            ),
        onClick = {
            showColorSplash = <span style="color: #008000; font-weight: bold">true</span>
        },
    )
}
</pre></div><br/>
<p>We use these bounds as an indication of where to start the color splash animation from.</p>
<image><div style="text-align: center;"><img id="imgFull" alt="moving image of a blue color splash transition between Androidify demo screens" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi_5rJvIIo6ioe_Jo5dkrOD7OuZdo4xJHbmnIqr-kVB6pViOimjtcqZUnxZD2trpDTvoySZWcxx4RjqQEPYhlm1vfr8M_NY7Y9rl7ea7XOUneHLGlIxHWQBjIGKaZZ78SddX1UoUYPoG88fNWcr9t09ShrI_KFZcg8nZaUF59enD7sbcL5x5gsLl-jsQiA/s1600/color-splash-androidify-google-io.gif" width="40%"/></div></image><br/>
<h2><span style="font-size: x-large;">Learn more delightful details</span></h2>
<p>From fun marquee animations on the results screen, to animated gradient buttons for the AI-powered actions, to the path drawing animation for the loading screen, this app has many delightful touches for you to experience and learn from.</p>
<image><div style="text-align: center;"><img id="imgFull" alt="animated marquee example" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEivftZEN51EFmOej_1nfp-7zrGWUxbJx7nEx5GgVAtjlK9t4f49VmKE0RSOWP6zhkhMTaj7fnYq5SH5su_5hzxCGoX9wXiyVEROOmPAwIev5dk6O8Uw_cIcPmqERVEA48vD4jcN3pSJvzHhLEchXU5h5locO8qOQFGO-eErP-oXSWU017BaF7Q3Auvjlh4/s1600/end-screen-animation-androidify-google-io.gif" width="40%"/></div></image><br/>
<image><div style="text-align: center;"><img id="imgFull" alt="animated gradient button for AI powered actions example" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj1RYgxeWS2GZA6Lk9JK0E74VBJBhHeySxP12qXhYwGLl42-o1pdKBCdQY_QUq5z0yW_eHnaRFqT5TiAqyjGJVZ2TqH8WO-QqqJYWhPWjrUJfHveBowy_h5ltkZ9xa54QuAS0e2pANkENbkg0qb5CaaU8Zp5u-ebYgbC-i6HJ_ACzG_sdS6q1aH89cgAZk/s1600/animated-gradient-button-androidify-google-io.gif" width="40%"/></div></image><br/>
<image><div style="text-align: center;"><img id="imgFull" alt="animated loading screen example" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh55cQzGPIITkAKUyTjNkhyphenhyphenXdP6MhVqLLJMKK6ecmFzxpCqbE2R0IdVghz4W3E6Lq4SWRHr9Z5Xm4kjPfwYpdosUoSZMk0OpEiEhyphenhyphenl0FSj9MSaCEzcgYdujoXVuPvFrjuYeY_dzUFQ9R_pZNrdQ-rV9EOa_IoaHFsvx3Dp5zNbVnBmCrXwgh1nCtRUU0qs/s1600/loading-screen-animation-androidify-google-io.gif" width="40%"/></div></image><br/>
<p>Check out the full codebase at <a href="http://github.com/android/androidify" target="_blank">github.com/android/androidify</a> and learn more about the latest in Compose, from Material 3 Expressive and the new modifiers to auto-sizing text, and of course a couple of delightful interactions!</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p>
Androidify: How Androidify leverages Gemini, Firebase and ML Kit<meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgit06SKK3XzFZSCJhq4ygask3mob-5PrLcFiDgsT7qXwo5gMmrrlufpNXWev3IZjqESamgtrAtMBsUK4ZVA-Zt9iqXr-th0L8cvQK057_xByadvbJWCzm16yCUBove2WCnhHijUqQHqILY1-aJxfap2BRDK_iPs4egL6J40WeYhQH4fRpnCER3kx-66Ic/s1600/O25-BHero-Android-1-Meta.png" name="twitter:image"></meta>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgit06SKK3XzFZSCJhq4ygask3mob-5PrLcFiDgsT7qXwo5gMmrrlufpNXWev3IZjqESamgtrAtMBsUK4ZVA-Zt9iqXr-th0L8cvQK057_xByadvbJWCzm16yCUBove2WCnhHijUqQHqILY1-aJxfap2BRDK_iPs4egL6J40WeYhQH4fRpnCER3kx-66Ic/s1600/O25-BHero-Android-1-Meta.png" style="display: none;" />
<em>Posted by Thomas Ezan – Developer Relations Engineer, Rebecca Franks – Developer Relations Engineer, and Avneet Singh – Product Manager</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgGn8cQLt5yYm4cg6RnPqvgDZ2yj-M1Bd-oqsfJAKyfTlDhX3D2FWDzzN0k4NjEMeR0VdUBOc39Sh9WTTNnuHvnqGAkKB45pGw0fr6Gc71UwdhrBrAQSK_xGvNthqD8kHHNDkDFXf4b1l9KzbfJ2FoFEFL3Cii1xUEGH4XDSgI8zeyEb4lKCvi3wIJC4MI/s1600/O25-BHero-Android-1.png"><img border="0" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgGn8cQLt5yYm4cg6RnPqvgDZ2yj-M1Bd-oqsfJAKyfTlDhX3D2FWDzzN0k4NjEMeR0VdUBOc39Sh9WTTNnuHvnqGAkKB45pGw0fr6Gc71UwdhrBrAQSK_xGvNthqD8kHHNDkDFXf4b1l9KzbfJ2FoFEFL3Cii1xUEGH4XDSgI8zeyEb4lKCvi3wIJC4MI/s1600/O25-BHero-Android-1.png" /></a>
<p>We’re bringing back Androidify later this year, this time powered by Google AI, so you can customize your very own Android bot and share your creativity with the world. Today, we’re releasing a new <a href="http://github.com/android/androidify" target="_blank">open source demo app for Androidify</a> as a great example of how Google is using its Gemini AI models to enhance app experiences.</p>
<p>In this post, we'll dive into how the Androidify app uses Gemini models and Imagen via the <a href="https://firebase.google.com/docs/vertex-ai/get-started?platform=android" target="_blank">Firebase AI Logic SDK</a>, and we'll provide some insights learned along the way to help you incorporate Gemini and AI into your own projects. Read more about the <a href="https://android-developers.googleblog.com/2025/05/androidify-building-ai-driven-experiences-jetpack-compose-gemini-camerax.html" target="_blank">Androidify demo app</a>.</p>
<h2><span style="font-size: x-large;">App flow</span></h2>
<p>The overall app functions as follows, with various parts of it using Gemini and Firebase along the way:</p>
<image><div style="text-align: center;"><img alt="flow chart demonstrating Androidify app flow" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEilK6O_vSvlOxVHNiTPOyF-ebqbR_ynIwGuIfA6jwQjbqwt8ymA-fa9TK4quvVxc9VDvlbq8cEboOTyZ4WUtfkWuLX3GOlVmUQ0biVyHiytRYiTn685he44PoE0KtpZ4XnEx4frRoZyICArEOWFKfchCYxE18L45a1gD0lRhIqsun45GTneT3fD7bKBGGk/s16000/androidify-app-flow-architecture.png" /></div></image><br />
<h3><span style="font-size: large;">Gemini and image validation</span></h3>
<p>To get started with Androidify, take a photo or choose an image on your device. The app needs to make sure that the image you upload is suitable for creating an avatar.</p>
<p>Gemini 2.5 Flash via Firebase helps with this by verifying that the image contains a person, that the person is in focus, and assessing image safety, including whether the image contains abusive content.</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%"><span style="color: #008000; font-weight: bold">val</span> jsonSchema = Schema.obj(
    properties = mapOf(<span style="color: #BA2121">"success"</span> to Schema.boolean(), <span style="color: #BA2121">"error"</span> to Schema.string()),
    optionalProperties = listOf(<span style="color: #BA2121">"error"</span>),
)
<span style="color: #008000; font-weight: bold">val</span> generativeModel = Firebase.ai(backend = GenerativeBackend.googleAI())
    .generativeModel(
        modelName = <span style="color: #BA2121">"gemini-2.5-flash-preview-04-17"</span>,
        generationConfig = generationConfig {
            responseMimeType = <span style="color: #BA2121">"application/json"</span>
            responseSchema = jsonSchema
        },
        safetySettings = listOf(
            SafetySetting(HarmCategory.HARASSMENT, HarmBlockThreshold.LOW_AND_ABOVE),
            SafetySetting(HarmCategory.HATE_SPEECH, HarmBlockThreshold.LOW_AND_ABOVE),
            SafetySetting(HarmCategory.SEXUALLY_EXPLICIT, HarmBlockThreshold.LOW_AND_ABOVE),
            SafetySetting(HarmCategory.DANGEROUS_CONTENT, HarmBlockThreshold.LOW_AND_ABOVE),
            SafetySetting(HarmCategory.CIVIC_INTEGRITY, HarmBlockThreshold.LOW_AND_ABOVE),
        ),
    )
<span style="color: #008000; font-weight: bold">val</span> response = generativeModel.generateContent(
    content {
        text(<span style="color: #BA2121">"You are to analyze the provided image and determine if it is acceptable and appropriate based on specific criteria.... (more details see the full sample)"</span>)
        image(image)
    },
)
<span style="color: #008000; font-weight: bold">val</span> jsonResponse = Json.parseToJsonElement(response.text!!)
<span style="color: #008000; font-weight: bold">val</span> isSuccess = jsonResponse.jsonObject[<span style="color: #BA2121">"success"</span>]?.jsonPrimitive?.booleanOrNull == <span style="color: #008000; font-weight: bold">true</span>
<span style="color: #008000; font-weight: bold">val</span> error = jsonResponse.jsonObject[<span style="color: #BA2121">"error"</span>]?.jsonPrimitive?.content
</pre></div>
<p>In the snippet above, we’re leveraging <a href="https://firebase.google.com/docs/vertex-ai/structured-output?platform=android" target="_blank">structured output</a> capabilities of the model by defining the schema of the response. We’re passing a <span style="color: #0d904f ; font-family: courier ;"><b>Schema</b></span> object via the <span style="color: #0d904f ; font-family: courier ;"><b>responseSchema</b></span> param in the <span style="color: #0d904f ; font-family: courier ;"><b>generationConfig</b></span>.</p>
<p>We want to validate that the image has enough information to generate a nice Android avatar. So we ask the model to return a JSON object with <span style="color: #0d904f ; font-family: courier ;"><b>success = true/false</b></span> and an optional <span style="color: #0d904f ; font-family: courier ;"><b>error</b></span> message explaining why the image doesn't have enough information.</p>
<p>Structured output is a powerful feature enabling a smoother integration of LLMs into your app by controlling the format of their output, similar to an API response.</p>
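Because the schema is enforced, the model's reply behaves like a typed API response. A hypothetical Kotlin mirror of that contract (the type and names here are ours for illustration, not part of the SDK) shows why downstream code becomes simpler:

```kotlin
// Hypothetical Kotlin model of the validation schema above:
// "success" is required; "error" is optional and only populated on rejection.
data class ImageValidationResult(
    val success: Boolean,
    val error: String? = null,
)

fun main() {
    val accepted = ImageValidationResult(success = true)
    val rejected = ImageValidationResult(success = false, error = "No person is in focus")
    // Downstream code can branch on a well-typed result instead of parsing free-form text.
    println(listOf(accepted, rejected).count { it.success })
}
```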
<h3><span style="font-size: large;">Image captioning with Gemini Flash</span></h3>
<p>Once it's established that the image contains sufficient information to generate an Android avatar, it is captioned using Gemini 2.5 Flash with structured output.</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%"><span style="color: #008000; font-weight: bold">val</span> jsonSchema = Schema.obj(
    properties = mapOf(
        <span style="color: #BA2121">"success"</span> to Schema.boolean(),
        <span style="color: #BA2121">"user_description"</span> to Schema.string(),
    ),
    optionalProperties = listOf(<span style="color: #BA2121">"user_description"</span>),
)
<span style="color: #008000; font-weight: bold">val</span> generativeModel = createGenerativeTextModel(jsonSchema)
<span style="color: #008000; font-weight: bold">val</span> prompt = <span style="color: #BA2121">"You are to create a VERY detailed description of the main person in the given image. This description will be translated into a prompt for a generative image model..."</span>
<span style="color: #008000; font-weight: bold">val</span> response = generativeModel.generateContent(
    content {
        text(prompt)
        image(image)
    },
)
<span style="color: #008000; font-weight: bold">val</span> jsonResponse = Json.parseToJsonElement(response.text!!)
<span style="color: #008000; font-weight: bold">val</span> isSuccess = jsonResponse.jsonObject[<span style="color: #BA2121">"success"</span>]?.jsonPrimitive?.booleanOrNull == <span style="color: #008000; font-weight: bold">true</span>
<span style="color: #008000; font-weight: bold">val</span> userDescription = jsonResponse.jsonObject[<span style="color: #BA2121">"user_description"</span>]?.jsonPrimitive?.content
</pre></div>
<p>The other option in the app is to start with a text prompt. You can enter details about your accessories, hairstyle, and clothing, and let Imagen be a bit more creative.</p>
<h3><span style="font-size: large;">Android generation via Imagen</span></h3>
<p>We’ll use this detailed description of your image to enrich the prompt used for image generation. We’ll add extra details around what we would like to generate and include the bot color selection as part of this too, including the skin tone selected by the user.</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%"><span style="color: #008000; font-weight: bold">val</span> imagenPrompt = <span style="color: #BA2121">"A 3D rendered cartoonish Android mascot in a photorealistic style, the pose is relaxed and straightforward, facing directly forward [...] The bot looks as follows $userDescription [...]"</span>
</pre></div>
<p>We then call the Imagen model to create the bot. Using this new prompt, we create a model and call <span style="color: #0d904f ; font-family: courier ;">generateImages</span>:</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%"><span style="color: #408080; font-style: italic">// we supply our own fine-tuned model here but you can use "imagen-3.0-generate-002"</span>
<span style="color: #008000; font-weight: bold">val</span> generativeModel = Firebase.ai(backend = GenerativeBackend.googleAI()).imagenModel(
    <span style="color: #BA2121">"imagen-3.0-generate-002"</span>,
    safetySettings = ImagenSafetySettings(
        ImagenSafetyFilterLevel.BLOCK_LOW_AND_ABOVE,
        personFilterLevel = ImagenPersonFilterLevel.ALLOW_ALL,
    ),
)
<span style="color: #008000; font-weight: bold">val</span> response = generativeModel.generateImages(imagenPrompt)
<span style="color: #008000; font-weight: bold">val</span> image = response.images.first().asBitmap()
</pre></div>
<p>And that’s it! The Imagen model generates a bitmap that we can display on the user’s screen.</p>
<h3><span style="font-size: large;">Fine-tuning the Imagen model</span></h3>
<p>The Imagen 3 model was fine-tuned using <a href="https://arxiv.org/abs/2106.09685" target="_blank">Low-Rank Adaptation (LoRA)</a>. LoRA is a fine-tuning technique designed to reduce the computational burden of training large models. Instead of updating the entire model, LoRA adds smaller, trainable "adapters" that make targeted changes to the model's behavior. We ran a fine-tuning pipeline on the generally available Imagen 3 model with Android bot assets of different color combinations and different assets for enhanced cuteness and fun. We generated text captions for the training images, and the image-text pairs were used to fine-tune the model effectively.</p>
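To build intuition for why LoRA is cheap, here is a toy numeric sketch in plain Kotlin (illustrative only; real LoRA operates on large transformer weight matrices, with A and B learned during fine-tuning):

```kotlin
// Toy illustration of the LoRA idea: instead of updating a full d x d weight
// matrix W, learn two small matrices B (d x r) and A (r x d) with rank r << d,
// and use the adapted weights W' = W + (alpha / r) * B * A.

fun matmul(a: Array<DoubleArray>, b: Array<DoubleArray>): Array<DoubleArray> {
    val n = a.size; val m = b[0].size; val k = b.size
    return Array(n) { i -> DoubleArray(m) { j -> (0 until k).sumOf { a[i][it] * b[it][j] } } }
}

fun main() {
    val d = 4; val r = 1; val alpha = 2.0
    // Frozen base weights (identity, for the sake of the toy example).
    val w = Array(d) { i -> DoubleArray(d) { j -> if (i == j) 1.0 else 0.0 } }
    // Trainable low-rank factors (values arbitrary here).
    val bMat = Array(d) { DoubleArray(r) { 0.5 } }
    val aMat = Array(r) { DoubleArray(d) { 0.1 } }
    val delta = matmul(bMat, aMat)
    val wAdapted = Array(d) { i -> DoubleArray(d) { j -> w[i][j] + (alpha / r) * delta[i][j] } }
    // The adapter has d*r + r*d = 8 trainable parameters instead of d*d = 16;
    // at realistic sizes (d in the thousands, r around 8) the saving is enormous.
    println(wAdapted[0].toList())
}
```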
<p>The current <a href="http://github.com/android/androidify" target="_blank">sample app</a> uses a standard Imagen model, so the results may look a bit different from the visuals in this post. However, the app using the fine-tuned model and a custom version of Firebase AI Logic SDK was demoed at Google I/O. This app will be released later this year and we are also planning on adding support for fine-tuned models to Firebase AI Logic SDK later in the year.</p>
<image><div style="text-align: center;"><img alt="moving image of Androidify app demo turning a selfie image of a bearded man wearing a black tshirt and sunglasses, with a blue back pack into a green 3D bearded droid wearing a black tshirt and sunglasses with a blue backpack" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiUWUfAjfjlb9Dim7K4vGfc-SRCnAR6t2a5A5kfpv7L1-ygYFLPNxSK9bzN6diYRxICZuL9cC-GjMVsE4FM8JgS5luQxJZqiQ7hpa9oxvhK5OG5K3Y_VfegvpFJDZDHdGSG8Xdj2EcZTjyGv-4vUSq9zLVfdRMs-UTVFLJq5nzG3p6f_gJj6xWm50jtAkU/s1600/androidify-app-demo.gif" width="35%"/></div><imgcaption><center><em>The original image... and <i>Androidifi-ed</i> image</em></center></imgcaption></image><br />
<h2><span style="font-size: x-large;">ML Kit</span></h2>
<p>The app also uses the <a href="https://developers.google.com/ml-kit/vision/pose-detection" target="_blank">ML Kit Pose Detection SDK</a> to detect a person in the camera view, which triggers the capture button and adds visual indicators.</p>
<p>To do this, we add the SDK to the app, and use <span style="color: #0d904f; font-family: courier">PoseDetection.getClient()</span>. Then, using the <span style="color: #0d904f; font-family: courier">poseDetector</span>, we look at the landmarks detected in the streaming images coming from the camera, and we set <span style="color: #0d904f; font-family: courier">_uiState.detectedPose</span> to true if a nose and both shoulders are visible:</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%"><span style="color: #008000; font-weight: bold">private</span> suspend <span style="color: #008000; font-weight: bold">fun</span> <span style="color: #0000FF">runPoseDetection</span>() {
PoseDetection.getClient(
PoseDetectorOptions.Builder()
.setDetectorMode(PoseDetectorOptions.STREAM_MODE)
.build(),
).use { poseDetector ->
<span style="color: #408080; font-style: italic">// Since image analysis is processed by ML Kit asynchronously in its own thread pool,</span>
<span style="color: #408080; font-style: italic">// we can run this directly from the calling coroutine scope instead of pushing this</span>
<span style="color: #408080; font-style: italic">// work to a background dispatcher.</span>
cameraImageAnalysisUseCase.analyze { imageProxy ->
imageProxy.image?.let { image ->
<span style="color: #008000; font-weight: bold">val</span> poseDetected = poseDetector.detectPersonInFrame(image, imageProxy.imageInfo)
_uiState.update { it.copy(detectedPose = poseDetected) }
}
}
}
}
<span style="color: #008000; font-weight: bold">private</span> suspend <span style="color: #008000; font-weight: bold">fun</span> PoseDetector.detectPersonInFrame(
image: Image,
imageInfo: ImageInfo,
): Boolean {
<span style="color: #008000; font-weight: bold">val</span> results = process(InputImage.fromMediaImage(image, imageInfo.rotationDegrees)).await()
<span style="color: #008000; font-weight: bold">val</span> landmarkResults = results.allPoseLandmarks
<span style="color: #008000; font-weight: bold">val</span> detectedLandmarks = mutableListOf<Int>()
<span style="color: #008000; font-weight: bold">for</span> (landmark <span style="color: #008000; font-weight: bold">in</span> landmarkResults) {
<span style="color: #008000; font-weight: bold">if</span> (landmark.inFrameLikelihood > <span style="color: #666666">0.7</span>) {
detectedLandmarks.add(landmark.landmarkType)
}
}
<span style="color: #008000; font-weight: bold">return</span> detectedLandmarks.containsAll(
listOf(PoseLandmark.NOSE, PoseLandmark.LEFT_SHOULDER, PoseLandmark.RIGHT_SHOULDER),
)
}
</pre></div>
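<p>The <span style="color: #0d904f; font-family: courier;">await()</span> call above bridges ML Kit’s Task-based API into coroutines. That extension comes from the coroutines Play Services interop artifact. A minimal sketch of the assumed Gradle wiring (the artifact coordinates are the published ones; version numbers are illustrative only):</p>
<div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%">// build.gradle.kts (app module)
dependencies {
    // ML Kit pose detection (PoseDetection, PoseDetectorOptions, PoseLandmark)
    implementation("com.google.mlkit:pose-detection:18.0.0-beta5")
    // Provides the Task.await() extension used in detectPersonInFrame()
    implementation("org.jetbrains.kotlinx:kotlinx-coroutines-play-services:1.8.1")
}
</pre></div>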
<image><div style="text-align: center;"><img alt="moving image showing the camera shutter button activating when an orange droid figurine is held in the camera frame" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgMtKfFPhqMDNlNuOga6sqMhSvvBlEoPLmfefQ5CT7jBG97equ7u8pEgDRq2iCa2_TUyHRMtf4P0vYxWBOdzHhONW6bYt1_3uIbiNZh_JMGHcoZwofagwZi0dh-zVLFadeu4ZmeAHNVImfs1tuqcXRDecF7z72I5uOK1tsmPBKX_zHucdsE7M8ewx8znsA/s1600/camera-shutter-button-activates-androidify-google-io.gif" width="35%"/></div><imgcaption><center><em>The camera shutter button is activated when a person (or a bot!) enters the frame.</em></center></imgcaption></image><br />
<h2><span style="font-size: x-large;">Get started with AI on Android</span></h2>
<p>The Androidify app makes extensive use of Gemini 2.5 Flash to validate the photo and generate a detailed description, which is then used to generate the image. It also leverages a specifically fine-tuned Imagen 3 model to generate images of Android bots. Gemini and Imagen models are easily integrated into the app via the Firebase AI Logic SDK. In addition, the ML Kit Pose Detection SDK controls the capture button, enabling it only when a person is present in front of the camera.</p>
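<p>As a concrete illustration, model access through Firebase AI Logic is only a few lines of Kotlin. The sketch below follows the Firebase AI Logic SDK’s documented Kotlin surface; the helper name and prompt are hypothetical, not taken from the Androidify codebase:</p>
<div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%">// Requires the com.google.firebase:firebase-ai dependency
val model = Firebase.ai(backend = GenerativeBackend.googleAI())
    .generativeModel("gemini-2.5-flash")

// Suspend call; returns the generated description text (or null)
suspend fun describeForBot(photo: Bitmap): String? =
    model.generateContent(
        content {
            image(photo)
            text("Describe this person for an Android bot illustration") // illustrative prompt
        }
    ).text
</pre></div>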
<p>To get started with AI on Android, go to the <a href="https://developer.android.com/ai/gemini" target="_blank">Gemini</a> and <a href="http://developer.android.com/ai/imagen" target="_blank">Imagen</a> documentation for Android.</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p>
Android Design at Google I/O 2025 (posted 2025-05-20)
<meta name="twitter:image" content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiRMeTkm4RmhaXfpKoI_OheJiHhcSWexGWxlLcNrQ3Zd4FoAjCZCiOyvuW-X7oB1FsJMwE_ZJcMWwfccKgxwKJDo1jm2VnNKcFBo1KnIyEysGhyphenhyphenHYcTdoamvgJr87PQHGLbqycXb7UOFjHj7FzEFo4G8CUrPUPisqjTtzVnm38Ofnoo0CMXWVjzR1q4W5w/s1600/android-design-google-io-2025.png">
<img style="display:none" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiRMeTkm4RmhaXfpKoI_OheJiHhcSWexGWxlLcNrQ3Zd4FoAjCZCiOyvuW-X7oB1FsJMwE_ZJcMWwfccKgxwKJDo1jm2VnNKcFBo1KnIyEysGhyphenhyphenHYcTdoamvgJr87PQHGLbqycXb7UOFjHj7FzEFo4G8CUrPUPisqjTtzVnm38Ofnoo0CMXWVjzR1q4W5w/s1600/android-design-google-io-2025.png">
<em>Posted by Ivy Knight – Senior Design Advocate</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi18gLLexq6txZXoiWtCEPrNZ1X2Wbgs78zFBz9HpTIDx5fSfJeKJCqrSyyKzj7gEE9C5_5AXQY7oexSEq5cNaieUvQPEegLnN-tunLnf2pm32X_2nJbba9IyYPl_txkmEViX9CArDVLHsn2sF_2FC7O55aq6b7R4qbROXG0u0vINlsgM3vr8hhU_EhPz8/s1600/android-design-google-io-2025-banner.png" imageanchor="1" ><img style="100%" border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi18gLLexq6txZXoiWtCEPrNZ1X2Wbgs78zFBz9HpTIDx5fSfJeKJCqrSyyKzj7gEE9C5_5AXQY7oexSEq5cNaieUvQPEegLnN-tunLnf2pm32X_2nJbba9IyYPl_txkmEViX9CArDVLHsn2sF_2FC7O55aq6b7R4qbROXG0u0vINlsgM3vr8hhU_EhPz8/s1600/android-design-google-io-2025-banner.png" data-original-width="100%" data-original-height="800" /></a>
<div><br/></div>
<p>Here’s your guide to the essential Android Design sessions, resources, and announcements for I/O ‘25:</p>
<h2><span style="font-size: x-large;">Check out the latest Android updates</span></h2>
<h3><span style="font-size: large;"><a href="https://www.android.com/new-features-on-android/io-2025/" target="_blank">The Android Show: I/O Edition</a></span></h3>
<p>The Android Show had a special I/O edition this year with some exciting announcements like Material 3 Expressive!</p>
<p>Learn more about the new Live Update Notification templates in the <a href="https://io.google/2025/explore/technical-session-53" target="_blank">Android Notifications & Live Updates</a> session for an in-depth look at what they are, when to use them, and why. You can also get the Live Update design template in the <a href="http://figma.com/@androiddesign" target="_blank">Android UI Kit</a>, read more in the updated Notification <a href="https://developer.android.com/design/ui/mobile/guides/home-screen/notifications" target="_blank">guidance</a>, and get hands-on with the <a href="https://goo.gle/jetsnacks-figma" target="_blank">Jetsnack Live Updates and Widget case study</a>.</p>
<h2><span style="font-size: x-large;">Make your apps more expressive</span></h2>
<p>Get a jump on the future of Google’s UX design: Material 3 Expressive. Learn how to use new emotional design patterns to boost engagement, usability, and desire for your product in the <b><a href="https://io.google/2025/explore/technical-session-24" target="_blank">Build Next-Level UX with Material 3 Expressive</a></b> session and check out the expressive update on <a href="http://Material.io" target="_blank">Material.io</a>.</p>
<p>Stay up to date with <b><a href="https://io.google/2025/explore/technical-session-8" target="_blank">Android Accessibility Updates</a></b>, highlighting accessibility features launching with Android 16: enhanced dark themes, options for those with motion sickness, a new way to increase text contrast, and more.</p>
<p>Catch the <b><a href="https://io.google/2025/explore/technical-session-16" target="_blank">Mastering text input in Compose</a></b> session to learn how engaging, robust text experiences are built with Jetpack Compose. It covers Autofill integration, dynamic text resizing, and custom input transformations. This is a great session to watch to see what’s possible when designing text inputs.</p>
<h2><span style="font-size: x-large;">Thinking across form factors</span></h2>
<p>These design resources and sessions can help you design across more Android form factors or update your existing experiences.</p>
<p>Preview Gemini in-car, imagining seamless navigation and personalized entertainment in the <b><a href="https://io.google/2025/explore/technical-session-18" target="_blank">New In-Car App Experiences</a></b> session. Then explore the new <b><a href="https://goo.gle/figma-car-app-design-kit" target="_blank">Car UI Design Kit</a></b> to bring your app to Android Car platforms and speed up your process with the latest Android form factor kit.</p>
<p>The <b><a href="https://io.google/2025/explore/technical-session-12" target="_blank">Engaging with users on Google TV with excellent TV apps</a></b> session discusses new ways the Google TV experience is making it easier for users to find and engage with content, including improvements to out-of-box solutions and updates to Android TV OS.</p>
<p>Get a peek at how to bring immersive content, like 3D models, to Android XR in the <b><a href="https://io.google/2025/explore/technical-session-22" target="_blank">Building differentiated apps for Android XR with 3D Content</a></b> session.</p>
<p>Plus, Wear OS is releasing an updated design kit on <a href="http://figma.com/@androiddesign" target="_blank">@AndroidDesign Figma</a> and a learning <a href="https://developer.android.com/courses/pathways/wear" target="_blank">Pathway</a>.</p>
<h2><span style="font-size: x-large;">Tip top apps</span></h2>
<p>We’ve also released the following new Android design guidance to help you design the best Android experiences:</p>
<b><a href="https://developer.android.com/design/ui/mobile/guides/patterns/settings" target="_blank">In-app Settings</a></b>
<p>Read up on the latest suggested patterns to build out your app’s settings.</p>
<b><a href="https://developer.android.com/design/ui/mobile/guides/patterns/help-content" target="_blank">Help and Feedback</a></b>
<p>Along with settings, learn about adding help and feedback to your app.</p>
<b><a href="https://developer.android.com/design/ui/mobile/guides/widgets/configuration" target="_blank">Widget Configuration</a></b>
<p>Does your app’s widget need setup? New guidance to help you add configuration to your app’s widgets.</p>
<b><a href="https://developer.android.com/design/ui/mobile/guides/layout-and-content/edge-to-edge" target="_blank">Edge-to-edge design</a></b>
<p>Allow your apps to take full advantage of the entire screen with the latest guidance on designing for edge-to-edge.</p>
<p><b>Check out <a href="http://figma.com/@androiddesign" target="_blank">figma.com/@androiddesign</a> for even more new and updated resources.</b></p>
<p>Visit the I/O 2025 website, <a href="https://io.google/2025/explore?focus_areas=Android" target="_blank">build your schedule</a>, and engage with the community. If you are at the Shoreline, come say hello to us in the <b>Android tent</b> at our booths.</p>
<p>We can't wait to see what you create with these new tools and insights. Happy I/O!</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p><br/>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjlG1P_Gj81pD1M9aqDLPMFMX-WVod57rjMj-Gi7jd2fG9D7WY50kOpA8ckv9hVkwek5GExy7PIS2Ktvyv_ltOECyjSyWJP__7k1E7t89PpqPdlbG8G3H24eG2dFjoy2nzlywU4-Led3PKeMOatCD70W-L_Qek2e62gk0N8K4AMsDtHqbaRw7Tgtd2pFNA/s1600/image1.png" imageanchor="1" ><img style="100%" border="0" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjlG1P_Gj81pD1M9aqDLPMFMX-WVod57rjMj-Gi7jd2fG9D7WY50kOpA8ckv9hVkwek5GExy7PIS2Ktvyv_ltOECyjSyWJP__7k1E7t89PpqPdlbG8G3H24eG2dFjoy2nzlywU4-Led3PKeMOatCD70W-L_Qek2e62gk0N8K4AMsDtHqbaRw7Tgtd2pFNA/s1600/image1.png" data-original-width="100%" data-original-height="800" /></a>
Google I/O 2025: Build adaptive Android apps that shine across form factors (posted 2025-05-20)
<meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_3jPnuulaOTdAqI2KLkndnyzkBdp7S_uJw0sNQ9mm6EbmBfmc_hDu6SKHfIcOHrIvDKQr_ElUNIu0_4a9f3x_mDBiWfb91HoFYVg815YzFJT6OaptfLBYUA7cNKB7SGxt4kidhTPKTlryC_yBvfolh8O9ZMlWvRywe2R27qvwYN93ouXxQrYkQ73TCik/s1600/adaptive-collage-google-io.png" name="twitter:image"></meta>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_3jPnuulaOTdAqI2KLkndnyzkBdp7S_uJw0sNQ9mm6EbmBfmc_hDu6SKHfIcOHrIvDKQr_ElUNIu0_4a9f3x_mDBiWfb91HoFYVg815YzFJT6OaptfLBYUA7cNKB7SGxt4kidhTPKTlryC_yBvfolh8O9ZMlWvRywe2R27qvwYN93ouXxQrYkQ73TCik/s1600/adaptive-collage-google-io.png" style="display: none;" />
<em>Posted by Fahd Imtiaz – Product Manager, Android Developer</em>
<iframe class="BLOG_video_class" allowfullscreen="" youtube-src-id="15oPNK1W0Tw" width="100%" height="413" src="https://www.youtube.com/embed/15oPNK1W0Tw"></iframe>
<p>If your app isn’t built to adapt, you’re missing out on the opportunity to reach a giant swath of users across 500 million devices! At Google I/O this year, we are exploring how adaptive development isn’t just a good idea, but essential to building apps that shine across the expanding Android device ecosystem. This is your guide to meeting users wherever they are, with experiences that are perfectly tailored to their needs.</p>
<h2><span style="font-size: x-large;">The advantage of building adaptive</span></h2>
<p>In today's multi-device world, users expect their favorite applications to work flawlessly and intuitively, whether they're on a smartphone, tablet, or Chromebook. This expectation for seamless experiences isn't just about convenience; it's an important factor for user engagement and retention.</p>
<p>For example, users of entertainment apps (including Prime Video, Netflix, and Hulu) who use both phone and tablet spend almost 200% more time in-app (nearly 3x the engagement) than phone-only users in the US<sup>*</sup>.</p>
<p><a href="https://android-developers.googleblog.com/2025/05/peacock-optimizes-streaming-jetpack-compose.html" target="_blank">Peacock, NBCUniversal’s streaming service</a>, has seen a trend of users moving between mobile and large screens, and building adaptively enables a single build to work across the different form factors.</p>
<blockquote><i>“This allows Peacock to have more time to innovate faster and deliver more value to its customers.”</i><div><b>– Diego Valente, Head of Mobile, Peacock and Global Streaming</b></div></blockquote>
<p>Adaptive Android development offers the strategic solution, enabling apps to perform effectively across an expanding array of devices and contexts through intelligent design choices that emphasize code reuse and scalability. With Android's continuous growth into new form factors and upcoming enhancements such as desktop windowing and connected displays in Android 16, an app's ability to seamlessly adapt to different screen sizes is becoming increasingly crucial for retaining users and staying competitive.</p>
<p>Beyond direct user benefits, designing adaptively also translates to increased visibility. The Google Play Store actively helps promote developers whose apps excel on different form factors. If your application delivers a great experience on tablets or is excellent on ChromeOS, users on those devices will have an easier time discovering your app. This creates a win-win situation: better quality apps for users and a broader audience for you.</p>
<image><div style="text-align: center;"><img alt="examples of form factors across small phones, tablets, laoptops, and auto" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjvVH3yYVGNc9LKbWtW7hOro-jrAfAvJr3bOOsYFsNEWESI37PTRRFmNsuFKs7b_ZJXieMr_duGW4-ArHUPtebQ1UafcxG-LLp9Rj8LHVyv-Kaz11pIZIMv_UVQw9RoWdUNErQtUk0IAUk6XrmGG9kMqde55hi-WU-gnnp0QGn7R8Hn75XQ_RIrRT_qKjA/s16000/android-adaptive-google-io.png" /></div></image><br />
<h2><span style="font-size: x-large;">Latest in adaptive Android development from Google I/O</span></h2>
<p>To help you more effectively build compelling adaptive experiences, we shared several key updates at I/O this year.</p>
<h3><span style="font-size: large;">Build for the expanding Android device ecosystem</span></h3>
<p>Your mobile apps can now reach users beyond phones on over <b>500 million</b> active devices, including foldables, tablets, Chromebooks, and even compatible cars, with minimal changes. Android 16 introduces significant advancements in desktop windowing for a true desktop-like experience on large screens and when devices are connected to external displays. And, Android XR is opening a new dimension, allowing your existing mobile apps to be available in immersive virtual environments.</p>
<h3><span style="font-size: large;">The mindset shift to Adaptive</span></h3>
<p>With the expanding Android device ecosystem, adaptive app development is a fundamental strategy. It's about how the same mobile app runs well across phones, foldables, tablets, Chromebooks, connected displays, XR, and cars, laying a strong foundation for future devices and differentiating for specific form factors. You don't need to rebuild your app for each form factor; rather, make small, iterative changes as needed. Embracing this adaptive mindset today isn't just about keeping pace; it's about leading the charge in delivering exceptional user experiences across the entire Android ecosystem.</p>
<image><div style="text-align: center;"><img alt="examples of form factors including vr headset" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg_3jPnuulaOTdAqI2KLkndnyzkBdp7S_uJw0sNQ9mm6EbmBfmc_hDu6SKHfIcOHrIvDKQr_ElUNIu0_4a9f3x_mDBiWfb91HoFYVg815YzFJT6OaptfLBYUA7cNKB7SGxt4kidhTPKTlryC_yBvfolh8O9ZMlWvRywe2R27qvwYN93ouXxQrYkQ73TCik/s1600/adaptive-collage-google-io.png" width="100%" /></div></image><br />
<h3><span style="font-size: large;">Leverage powerful tools and libraries to build adaptive apps:</span></h3>
<ul><ul>
<li><b><a href="https://developer.android.com/develop/ui/compose/build-adaptive-apps#compose_material_3_adaptive" target="_blank">Compose Adaptive Layouts library</a></b>: This library makes adaptive development easier by allowing your app code to fit into canonical layout patterns like list-detail and supporting pane, that automatically reflow as your app is resized, flipped or folded. In the 1.1 release, we introduced pane expansion, allowing users to resize panes. The Socialite demo app showcased how one codebase using this library can adapt across six form factors. New adaptation strategies like "Levitate" (elevating a pane, e.g., into a dialog or bottom sheet) and "Reflow" (reorganizing panes on the same level) were also announced in 1.2 (alpha). For XR, component overrides can automatically spatialize UI elements.</li></ul><br/><ul>
<li><b><a href="http://goo.gle/nav3" target="_blank">Jetpack Navigation 3 (Alpha)</a></b>: This new navigation library simplifies defining user journeys across screens with less boilerplate code, especially for multi-pane layouts in Compose. It helps handle scenarios where list and detail panes might be separate destinations on smaller screens but shown together on larger ones. Check out the new Jetpack Navigation library in alpha.</li></ul><br/><ul>
<li><b><a href="https://developer.android.com/develop/ui/compose/touch-input" target="_blank">Jetpack Compose input enhancements</a></b>: Compose's layered architecture, strong input support, and single location for layout logic simplify creating adaptive UIs. Upcoming in Compose 1.9 are right-click context menus and enhanced trackpad/mouse functionality.</li></ul><br/><ul>
<li><b><a href="https://developer.android.com/develop/ui/compose/layouts/adaptive/use-window-size-classes" target="_blank">Window Size Classes</a></b>: Use window size classes for top-level layout decisions. AndroidX.window 1.5 introduces two new width size classes – "large" (1200dp to 1600dp) and "extra-large" (1600dp and larger) – providing more granular breakpoints for large screens. This helps in deciding when to expand navigation rails or show three panes of content. Support for these new breakpoints was also announced in the Compose adaptive layouts library 1.2 alpha, along with <a href="https://m3.material.io/foundations/layout/applying-layout/window-size-classes" target="_blank">design guidance</a>.</li></ul><br/><ul>
<li><b><a href="https://developer.android.com/develop/ui/compose/tooling/previews" target="_blank">Compose previews</a></b>: Get quick feedback by visualizing your layouts across a wide variety of screen sizes and aspect ratios. You can also specify different devices by name to preview your UI on their respective sizes and with their inset values.</li></ul><br/><ul>
<li><b><a href="https://developer.android.com/training/testing/different-screens" target="_blank">Testing adaptive layouts</a></b>: Validating your adaptive layouts is crucial and Android Studio offers various tools for testing – including previews for different sizes and aspect ratios, a resizable emulator to test across different screen sizes with a single AVD, screenshot tests, and instrumented behavior tests. And with Journeys with Gemini in Android Studio, you can define tests using natural language for even more robust testing across different window sizes.</li>
</ul></ul>
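<p>The width breakpoints above translate directly into top-level layout decisions. Below is a pure-logic sketch using the raw dp values quoted in this post; in real code you would read these from the WindowSizeClass APIs rather than comparing dp yourself, and the pane choices shown are illustrative:</p>
<div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%">// Illustrative mapping from window width to a top-level layout decision
fun paneCountFor(widthDp: Int): Int = when {
    widthDp >= 1600 -> 3 // "extra-large": three panes of content
    widthDp >= 1200 -> 3 // "large": three panes, or an expanded navigation rail
    widthDp >= 840  -> 2 // expanded: list + detail
    else            -> 1 // compact/medium: single pane
}
</pre></div>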
<h3><span style="font-size: large;">Ensuring app availability across devices</span></h3>
<p>Avoid <a href="https://android-developers.googleblog.com/2023/12/increase-your-apps-availability-across-device-types.html" target="_blank">unnecessarily declaring required features</a> (like specific cameras or GPS) in your manifest, as this can prevent your app from appearing in the Play Store on devices that lack those specific hardware components but could otherwise run your app perfectly.</p>
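<p>For example, a manifest that declares hardware as optional keeps the app discoverable on devices without that hardware; the feature names below are standard Android feature constants, and you would check for them at runtime with <span style="color: #0d904f; font-family: courier;">PackageManager.hasSystemFeature()</span> before use:</p>
<div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%">&lt;!-- AndroidManifest.xml: camera and GPS are used when present, but not required --&gt;
&lt;uses-feature android:name="android.hardware.camera" android:required="false" /&gt;
&lt;uses-feature android:name="android.hardware.location.gps" android:required="false" /&gt;
</pre></div>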
<h3><span style="font-size: large;">Handling different input methods</span></h3>
<p>Remember to <a href="https://developer.android.com/develop/ui/compose/touch-input/input-compatibility-on-large-screens" target="_blank">handle various input methods</a> like touch, keyboard, and mouse, especially with Chromebook detachables and connected displays.</p>
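<p>In Compose, hardware keyboard input can be layered onto an existing touch UI with the key-event modifiers. A minimal sketch, where the <span style="color: #0d904f; font-family: courier;">onConfirm</span> callback is hypothetical:</p>
<div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%">// Make a composable focusable and react to the Enter key
Modifier
    .focusable()
    .onKeyEvent { event ->
        if (event.type == KeyEventType.KeyUp &amp;&amp; event.key == Key.Enter) {
            onConfirm() // hypothetical action; mirrors the touch onClick
            true        // event consumed
        } else {
            false
        }
    }
</pre></div>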
<h3><span style="font-size: large;">Prepare for orientation and resizability API changes in Android 16</span></h3>
<p><a href="https://android-developers.googleblog.com/2025/01/orientation-and-resizability-changes-in-android-16.html" target="_blank">Beginning in Android 16</a>, for apps targeting SDK 36, manifest and runtime restrictions on orientation, resizability, and aspect ratio will be ignored on displays that are at least 600dp in both dimensions. To meet user expectations, your apps will need layouts that work for both portrait and landscape windows, and support resizing at runtime. There's a temporary opt-out manifest flag at both the application and activity level to delay these changes until targetSdk 37, and these changes currently do not apply to apps categorized as "Games". Learn more about these <a href="https://developer.android.com/about/versions/16/behavior-changes-16#adaptive-layouts" target="_blank">API changes</a>.</p>
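<p>The temporary opt-out is declared in the manifest. The property name below is the one given in the Android 16 behavior-changes documentation linked above; verify it against the current docs before shipping:</p>
<div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%">&lt;application&gt;
    &lt;!-- Delay the Android 16 orientation/resizability changes until targetSdk 37 --&gt;
    &lt;property
        android:name="android.window.PROPERTY_COMPAT_ALLOW_RESTRICTED_RESIZABILITY"
        android:value="true" /&gt;
&lt;/application&gt;
</pre></div>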
<h3><span style="font-size: large;">Adaptive considerations for games</span></h3>
<p><a href="https://developer.android.com/games/develop/multiplatform/overview" target="_blank">Games need to be adaptive too</a> and Unity 6 will add enhanced support for configuration handling, including APIs for screenshots, aspect ratio, and density. Success stories like Asphalt Legends Unite show significant user retention increases on foldables after implementing adaptive features.</p>
<image><div style="text-align: center;"><img alt="examples of form factors including vr headset" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiLqwf60GpiTSX31nRkY9T5p-66H8ueTp2GkPsgqkosiBkLDS7DVkDl9hhEcdsTXws_KMvfu9QB6yTRKISCSpGm_36aooQgJtom3GsZNIVahvSao8gRMAV6TYQPrWkkH20WPLRuT1XqAzEYFzAhDxO5kd4aaPTIPkmDnWVPiwDM0-yLYrMfewi2jeWcIXw/s1600/adaptive-android-examples-form-factors-banner.png" width="100%" /></div></image><br />
<h2><span style="font-size: x-large;">Start building adaptive today</span></h2>
<p>Now is the time to elevate your Android apps, making them intuitively responsive across form factors. With the latest tools and updates we’re introducing, you have the power to build experiences that seamlessly flow across all devices, from foldables to cars and beyond. Implementing these strategies will allow you to expand your reach and delight users across the Android ecosystem.</p>
<p>Get inspired by the “<a href="https://youtu.be/15oPNK1W0Tw" target="_blank">Adaptive Android development makes your app shine across devices</a>” talk, and explore all the resources you’ll need to start your journey at <a href="http://developer.android.com/adaptive-apps" target="_blank">developer.android.com/adaptive-apps</a>!</p>
<p>Explore this announcement and all Google I/O 2025 updates on <a href="https://io.google/2025/?utm_source=blogpost&utm_medium=pr&utm_campaign=event&utm_content=" target="_blank">io.google</a> starting May 22.</p><br/>
<small><sup>*</sup><i>Source: internal Google data</i></small>
The Android Show: I/O Edition - what Android devs need to know! (posted 2025-05-13)
<meta name="twitter:image" content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEivjcWPthi-WHxcwoy7tZK8O4CVv66U55HhVtEzQJedml2pY3xEjX-C8CbtSiB-vZywGNEl05lAXSxkfoo5WBNfXtxabZ2RRNs8vD0IBDoCQfLKBmSaZTkYC8DseoyeklNgP1n8ffvodiQocbhP7Epjpgb162Ydn5lmyNE3PUVJq7l_pjkYB5rtMLbSwqI/s1600/theandroidshow-google-io-2025.png">
<img style="display:none" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEivjcWPthi-WHxcwoy7tZK8O4CVv66U55HhVtEzQJedml2pY3xEjX-C8CbtSiB-vZywGNEl05lAXSxkfoo5WBNfXtxabZ2RRNs8vD0IBDoCQfLKBmSaZTkYC8DseoyeklNgP1n8ffvodiQocbhP7Epjpgb162Ydn5lmyNE3PUVJq7l_pjkYB5rtMLbSwqI/s1600/theandroidshow-google-io-2025.png">
<em>Posted by Matthew McCullough – Vice President, Product Management, Android Developer</em>
<div><br/></div>
<p>We just dropped an <b>I/O Edition of The Android Show</b>, where we unpacked exciting new experiences coming to the Android ecosystem: a fresh and dynamic look and feel, smarts across your devices, and enhanced safety and security features. <a href="http://android.com/io25" target="_blank">Join Sameer Samat</a>, President of Android Ecosystem, and the Android team to learn about these exciting new developments in the episode below, and read about all of the <a href="https://blog.google/products/android/the-android-show-io-2025" target="_blank">updates for users</a>.</p>
<p>Tune into <a href="https://io.google/2025/" target="_blank">Google I/O</a> next week – including the <a href="https://io.google/2025/explore/developer-keynote-1" target="_blank">Developer Keynote</a> as well as the <a href="https://io.google/2025/explore?focus_areas=Android" target="_blank">full Android track of sessions</a> – where we’re covering these topics in more detail and how you can get started.</p>
<iframe class="BLOG_video_class" allowfullscreen="" youtube-src-id="l3yDd3CmA_Y" width="100%" height="498" src="https://www.youtube.com/embed/l3yDd3CmA_Y"></iframe><br/>
<h2><span style="font-size: x-large ;">Start building with Material 3 Expressive</span></h2>
<p>The world of UX design is constantly evolving, and you deserve the tools to create truly engaging and impactful experiences. That’s why Material Design’s latest evolution, <b>Material 3 Expressive</b>, provides new ways to make your product more engaging, easy to use, and desirable. Learn more and try out <a href="https://m3.material.io/blog/building-with-m3-expressive" target="_blank">Material 3 Expressive</a>: an expansion pack designed to enhance your app’s appeal by harnessing emotional UX. It comes with new components, a motion-physics system, type styles, colors, shapes, and more.</p>
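<p>In Compose, trying Material 3 Expressive is largely a theme-level switch. A hedged sketch, assuming the material3 alpha artifact that ships <span style="color: #0d904f; font-family: courier;">MaterialExpressiveTheme</span> and the expressive motion scheme:</p>
<div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%">// Opt the app into Material 3 Expressive (material3 alpha APIs)
@Composable
fun AppTheme(content: @Composable () -> Unit) {
    MaterialExpressiveTheme(
        motionScheme = MotionScheme.expressive(), // new motion-physics system
        content = content,
    )
}
</pre></div>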
<p>Material 3 Expressive will be coming to Android 16 later this year; check out the <a href="https://io.google/2025/explore/technical-session-24" target="_blank">Google I/O talk</a> next week where we’ll dive into this in more detail.</p>
<h2><span style="font-size: x-large ;">A fluid design built for your watch's round display</span></h2>
<p><b>Wear OS 6</b>, <a href="http://blog.google/products/android/material-3-expressive-android-wearos-launch" target="_blank">arriving later this year</a>, brings Material 3 Expressive design to Google’s smartwatch platform. The new design language puts the round watch display at the heart of the experience, and is embraced in every component and motion of the system, from buttons to notifications. You’ll be able to try the new visual design and upgrade existing app experiences to a new level. Next week, tune in to the <a href="https://io.google/2025/explore/pa-keynote-7" target="_blank">What’s New in Android</a> session to learn more.</p>
<h2><span style="font-size: x-large ;">Plus some goodies in Android 16...</span></h2>
<p>We also unpacked some of the latest features coming to users in <a href="https://developer.android.com/about/versions/16" target="_blank">Android 16</a>, which we’ve been <a href="https://android-developers.googleblog.com/search?q=Android+16" target="_blank">previewing with you</a> for the last few months. If you haven’t already, you can <a href="https://developer.android.com/about/versions/16/get" target="_blank">try out the latest Beta of Android 16</a>.</p>
<p>A few new features that Android 16 adds which developers should pay attention to are Live updates, professional media and camera features, desktop windowing for tablets, major accessibility enhancements and much more:</p>
<ul><ul>
<li>Live Updates allow your app to show time-sensitive progress updates. Use the new <span style="color: #0d904f ; font-family: courier ;">ProgressStyle</span> template for an improved experience around navigation, deliveries, and rideshares.</li></ul><ul>
<li>Professional media and camera features include <a href="https://developer.android.com/about/versions/16/features#hybrid-auto-exposure" target="_blank">hybrid auto exposure</a>, <a href="https://developer.android.com/about/versions/16/features#color-temperature-tint" target="_blank">precise color temperature and tint adjustments</a>, <a href="https://developer.android.com/about/versions/16/features#night-mode-scene-detection" target="_blank">night mode scene detection</a>, and support for the new <a href="https://developer.android.com/about/versions/16/features#apv" target="_blank">Advanced Professional Video</a> format.</li>
</ul></ul>
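<p>As a sketch of the first bullet, a rideshare-style Live Update built with the new template might look like the following. <span style="color: #0d904f; font-family: courier;">ProgressStyle</span> and its <span style="color: #0d904f; font-family: courier;">Segment</span> type are from the Android 16 SDK, while the channel, icon, text, and values are illustrative:</p>
<div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%">// Android 16: Live Update notification using the ProgressStyle template
val notification = Notification.Builder(context, RIDE_CHANNEL_ID) // hypothetical channel
    .setSmallIcon(R.drawable.ic_ride)                             // hypothetical resource
    .setContentTitle("Driver is 5 minutes away")
    .setStyle(
        Notification.ProgressStyle()
            .setProgressSegments(
                listOf(
                    Notification.ProgressStyle.Segment(60), // pickup leg
                    Notification.ProgressStyle.Segment(40), // drop-off leg
                )
            )
            .setProgress(45) // how far along the journey is
    )
    .build()
</pre></div>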
<p>Watch the <a href="https://io.google/2025/explore/pa-keynote-7" target="_blank">What’s New in Android</a> session and the <a href="https://io.google/2025/explore/technical-session-53" target="_blank">Live updates</a> talk to learn more.</p>
<h2><span style="font-size: x-large ;">Tune in next week to Google I/O</span></h2>
<p>This was just a preview of some Android-related news, so <a href="https://android-developers.googleblog.com/2025/04/google-io-program-lineup-revealed.html" target="_blank">remember to tune in next week</a> to <a href="https://io.google/2025/" target="_blank">Google I/O</a>, where we’ll be diving into a range of Android developer topics in a lot more detail. You can check out <a href="https://io.google/2025/explore/pa-keynote-7" target="_blank">What’s New in Android</a> and the <a href="https://io.google/2025/explore?focus_areas=Android" target="_blank">full Android track of sessions</a> to start planning your time.</p>
<p>We can’t wait to see you next week, whether you’re joining in person or virtually from anywhere around the world!</p>
#WeArePlay: How My Lovely Planet is making environmental preservation fun through games (posted 2025-05-13)
<meta name="twitter:image" content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh0RHGY7A2hPHN1ogd4j_XOugZxQu6dzGpOEUT7HqRV2fnYHY-OVoK0ySu5SazQmM8G73M4Jmha69rvgsbpQDXK8hgnWHJPDatBuqlEFwAeR4UQfzua5tlcmoRnVj1jBIxknxV8Auh63syw9I69VuaEZ2l8E-oo6fc53UsKgXf0ZqIkk8FoBmtcE36UGiA/s1600/weareplay-my-lovely-planet-environmental-preservation-game.png">
<img style="display:none" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh0RHGY7A2hPHN1ogd4j_XOugZxQu6dzGpOEUT7HqRV2fnYHY-OVoK0ySu5SazQmM8G73M4Jmha69rvgsbpQDXK8hgnWHJPDatBuqlEFwAeR4UQfzua5tlcmoRnVj1jBIxknxV8Auh63syw9I69VuaEZ2l8E-oo6fc53UsKgXf0ZqIkk8FoBmtcE36UGiA/s1600/weareplay-my-lovely-planet-environmental-preservation-game.png">
<em>Posted by Robbie McLachlan – Developer Marketing</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiG90W4G7Yb0XMHaWTPrbSy5cg6PhcBIliHz5Ee9QmPeqnKXCX5fykOqPiv-Jx3zJM-dllFvnWqmo6VKSleVQmcrrf_WKO6og7DkTy4O5c8UajOHB_XHwSwr_VLHyHBsMRKQOhNlvSmzmgbvXUnkaBJA1CbmpIxf-uBRU1xINUDHPqgsjucrqEh5ImJQqw/s1600/WeArePlay_GlobalGrid_2024.png"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiG90W4G7Yb0XMHaWTPrbSy5cg6PhcBIliHz5Ee9QmPeqnKXCX5fykOqPiv-Jx3zJM-dllFvnWqmo6VKSleVQmcrrf_WKO6og7DkTy4O5c8UajOHB_XHwSwr_VLHyHBsMRKQOhNlvSmzmgbvXUnkaBJA1CbmpIxf-uBRU1xINUDHPqgsjucrqEh5ImJQqw/s1600/WeArePlay_GlobalGrid_2024.png" /></a>
<div><br/></div>
<p>In our latest <a href="https://play.google.com/console/about/weareplay/" target="_blank">#WeArePlay</a> film, which celebrates the people behind apps and games on Google Play, we meet Clément, the founder of Imagine Games. His game, <a href="https://play.google.com/store/apps/details?id=com.mylovelyplanet.mylovelyforests&gl=fr" target="_blank">My Lovely Planet</a>, turns casual mobile gaming into tangible environmental action, planting real trees and supporting reforestation projects worldwide. Discover the inspiration behind My Lovely Planet and the impact it’s had so far.</p>
<iframe class="BLOG_video_class" allowfullscreen="" youtube-src-id="TrgDdhvzwt4" width="100%" height="413" src="https://www.youtube.com/embed/TrgDdhvzwt4"></iframe><br/>
<h4><span style="font-size: large;">What inspired you to combine gaming with positive environmental impact?</span></h4>
<p>I’ve always loved gaming and believed in technology’s potential to tackle environmental challenges. But it was my time working with an NGO in Madagascar, where I witnessed firsthand the devastating effects of environmental change, that truly sparked my mission. Combining gaming and sustainability just made sense. Billions of people play games, so why not harness that entertainment to create real-world impact? So far, the results speak for themselves: we've built an engaged global community committed to protecting the environment.</p>
<image><div style="text-align: center;"><img alt="Imagine Games team, Clément, from France" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEgu0GgaI-i0NMfW7cmqmKysZXfwynzQagbmxQIeQRYXWP8yA4HJMG0cvM6Gsne5H5sfsxh-IfeYG9MhwoWby6vxUONvfY9XPvdz9QrWGamY_T7JTIzRMZYE0mVVgkjO4RURrzcl_nnOVV_-97UMa5NYlOmhm8WN94uiBwP7bRrgGXLczjao-o7sCihcfCE/s1600/08_FRANCE_IMAGINE%20GAMES%20_TEAM.jpg" /></div></image><br />
<h4><span style="font-size: large;">How do players in <i>My Lovely Planet</i> make real-world differences through the game?</span></h4>
<p>With My Lovely Planet, planting a tree in the game means planting a real tree in the world. Our community has already planted over 360,000 trees through partnerships with NGOs like Graines de Vie in Madagascar, Kenya, and France. We've also supported ocean-cleaning, bee-protection, and drone reforestation projects.</p>
<p>Balancing fun with impact was key. Players wouldn’t stay just for the mission, so we focused on creating a genuinely fun match-3 style game. Once gameplay was strong, we made real-world actions like tree planting core rewards in the game, helping players feel naturally connected to their impact. Our goal is to keep growing this model to protect biodiversity and fight climate change.</p>
<h4><span style="font-size: large;">Can you tell us about your drone-led reforestation project in France?</span></h4>
<p>Our latest initiative involves using drones to reforest areas severely impacted by insect infestations and other environmental issues. We're dropping over one million specially-coated seeds by drone, which is a completely new and efficient way of reforesting large areas. It’s exciting because if this pilot succeeds, it could be replicated worldwide, significantly boosting global reforestation efforts.</p>
<image><div style="text-align: center;"><img alt="a drone in mid air dropping seeds in a forested area" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg-6LQh4RXwFwJUE2pxbvzeqgHxb3ob8BizL6n3UhUX30mALyQlkXMYut4a749fEBwMTgpWH8bTtfsJh8MY6JPEmTEzHCIptcWK2P6x07FGL2D6SujFQbvb3VeiQeNSM15jahLPCN30XuIfYQqldUA_pPO5cwBREFgw4H2hN1qSLRwMrGzAbowHYXjk5oA/s1600/10_FRANCE_IMAGINE%20GAMES%20.jpg" /></div></image><br />
<h4><span style="font-size: large;">How has Google Play helped your journey?</span></h4>
<p>Google Play has been crucial for My Lovely Planet – it's our main distribution channel, with about 70% of our players coming through the platform. It makes it incredibly easy and convenient for anyone to download and start playing immediately, which is essential for engaging a global community. Plus, from a developer's standpoint, the flexibility, responsiveness, and powerful testing tools Google Play provides have made launching and scaling our game faster and smoother, allowing us to focus even more on our environmental impact.</p>
<image><div style="text-align: center;"><img alt="a close up of a user playing the My Lovely Planet game on their mobile device while sitting in the front seat of a vehicle" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh3tQP3RxyS9taUZTuKlJPb0vkCAFg3KSKObvAxIRzHvg9q7ghHAaIuQi0j_GP7BnMCKSwsg-XbWH4kQwO0PiVpzho9fxvcFlUn7vq1xCNBVkHUscVq7zA0PETcD-rHkLdLYUscsrf5fV7HoNpulx_fI6Hmw01y99oCRMkluoOBOqUXnCs8D280Vuaqiv4/s1600/07_FRANCE_IMAGINE%20GAMES%20_MY%20LOVELY%20PLANET%20GAME.jpg" /></div></image><br />
<h4><span style="font-size: large;">What is next for <i>My Lovely Planet</i>?</span></h4>
<p>Right now, we're focused on expanding the game experience by adding more engaging levels, and introducing exciting new features like integrating our eco-friendly cryptocurrency, My Lovely Coin, into gameplay. Following the success of our first drone-led reforestation project in France, our next step is tracking its impact and expanding this approach to other regions. Ultimately, we aim to build the world's largest gaming community dedicated to protecting the environment, empowering millions to make a difference while enjoying the game.</p><br/>
<p>Discover other inspiring app and game founders featured in <a href="https://play.google.com/console/about/weareplay/" target="_blank">#WeArePlay</a>.</p>
Prepare your apps for Google Play’s 16 KB page size compatibility requirement (published 2025-05-08)<meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhCWEnw2TTPOlZKXxpWfQ9pJ4bNE_1HLIa62yDkznzRAR_Rc2aMKz3_NqPDB4b-3_zzfb7STVR6ZsgmdNFy56lsvjicCk5O8c9M3qvV61GByphVTbCTyQerylT4pQoNRvfLf1RIvlmpMKpu0DiaRczWD-eUe8O5tPKWZ896lQmS0OjuDqq-wLmLXn7KT-M/s1600/16KB_GooglePlay_Metadata.png" name="twitter:image"></meta>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhCWEnw2TTPOlZKXxpWfQ9pJ4bNE_1HLIa62yDkznzRAR_Rc2aMKz3_NqPDB4b-3_zzfb7STVR6ZsgmdNFy56lsvjicCk5O8c9M3qvV61GByphVTbCTyQerylT4pQoNRvfLf1RIvlmpMKpu0DiaRczWD-eUe8O5tPKWZ896lQmS0OjuDqq-wLmLXn7KT-M/s1600/16KB_GooglePlay_Metadata.png" style="display: none;" />
<em>Posted by Dan Brown – Product Manager, Google Play</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg893CUiLFGN4At2mbWkP75w1aqCd58B1KgUucgntcTc1nb3-8v4DP8cdem96pWzycMtUYOIwx5UQckgBLdF3wSylmoTm5AOfOmQynCCqtaC9wtPSCEhj7gwTj9sqnnfmkmRWPYK9KSnlo64wkYgerSzDevYA7A9gz6lQhgukX16ZcG2ncobQwTTjNgAJ4/s1600/1KB_GooglePlay_HeroBlog.png"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEg893CUiLFGN4At2mbWkP75w1aqCd58B1KgUucgntcTc1nb3-8v4DP8cdem96pWzycMtUYOIwx5UQckgBLdF3wSylmoTm5AOfOmQynCCqtaC9wtPSCEhj7gwTj9sqnnfmkmRWPYK9KSnlo64wkYgerSzDevYA7A9gz6lQhgukX16ZcG2ncobQwTTjNgAJ4/s1600/1KB_GooglePlay_HeroBlog.png" /></a>
<p>Google Play empowers you to manage and distribute your innovative and trusted apps and games to billions of users around the world across the entire breadth of Android devices, and historically, all Android devices have managed memory in 4 KB pages.</p>
<p>As device manufacturers equip devices with more RAM to optimize performance, many will adopt larger page sizes like 16 KB. <a href="https://developer.android.com/about/versions/15/behavior-changes-all#16-kb" target="_blank">Android 15</a> introduces support for the increased page size, ensuring your app can run on these evolving devices and benefit from the associated performance gains.</p>
<blockquote><b>Starting November 1st, 2025, all new apps and updates to existing apps submitted to Google Play and targeting Android 15+ devices must support 16 KB page sizes.</b></blockquote>
<p>This is a key technical requirement to ensure your users can benefit from the performance enhancements on newer devices and prepares your apps for the platform's future direction of improved performance on newer hardware. Without recompiling to support 16 KB pages, your app might not function correctly on these devices when they become more widely available in future Android releases.</p>
<p>We’ve seen that 16 KB can help with:</p>
<ul>
<li><b>Faster app launches:</b> See improvements ranging from 3% to 30% for various apps.</li>
<li><b>Improved battery usage:</b> Experience an average gain of 4.5%.</li>
<li><b>Quicker camera starts:</b> Launch the camera 4.5% to 6.6% faster.</li>
<li><b>Speedier system boot-ups:</b> Boot Android devices approximately 8% faster.</li>
</ul>
<p><b>We recommend checking your apps early</b>, especially for dependencies that might not yet be 16 KB compatible. Many popular SDK providers, like <a href="https://reactnative.dev/blog/2025/01/21/version-0.77" target="_blank">React Native</a> and <a href="https://github.com/flutter/flutter/issues/150168" target="_blank">Flutter</a>, already offer compatible versions. For game developers, several leading game engines, such as <a href="https://developer.android.com/games/engines/unity/unity-on-android#16-kb-page-support" target="_blank">Unity</a>, support 16 KB, with support for Unreal Engine coming soon.</p>
<h2><span style="font-size: x-large;">Reaching 16 KB compatibility</span></h2>
<p>A substantial number of apps are already compatible, so your app may already work seamlessly with this requirement. For most of those that need to make adjustments, we expect the changes to be minimal.</p>
<ul>
<li>Apps with no native code should be compatible without any changes at all.</li>
<li>Apps using libraries or SDKs that contain native code may need to update these to a compatible version.</li>
<li>Apps with native code may need to <a href="https://developer.android.com/guide/practices/page-sizes#compile-16-kb-alignment" target="_blank">recompile with a more recent toolchain</a> and check for any code with incompatible low level memory management.</li>
</ul>
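<p>For apps building native code with the Android Gradle Plugin and CMake, one way to opt in to 16 KB-aligned shared libraries is the CMake flag shown below. This is a sketch, assuming NDK r27 or later (where <span style="font-family: courier;">ANDROID_SUPPORT_FLEXIBLE_PAGE_SIZES</span> is available; newer NDK releases enable 16 KB alignment by default):</p>

```kotlin
// Module-level build.gradle.kts (sketch; requires NDK r27+,
// and 16 KB alignment becomes the default in later NDK releases)
android {
    defaultConfig {
        externalNativeBuild {
            cmake {
                // Produce ELF shared libraries whose load segments are
                // compatible with both 4 KB and 16 KB page sizes
                arguments += "-DANDROID_SUPPORT_FLEXIBLE_PAGE_SIZES=ON"
            }
        }
    }
}
```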
<p>Our December blog post, <a href="https://android-developers.googleblog.com/2024/12/get-your-apps-ready-for-16-kb-page-size-devices.html" target="_blank">Get your apps ready for 16 KB page size devices</a>, provides a more detailed technical explanation and guidance on how to prepare your apps.</p>
<h2><span style="font-size: x-large;">Check your app's compatibility now</span></h2>
<p>It's easy to see if your app bundle already supports 16 KB memory page sizes. Visit the <a href="https://play.google.com/console/developers/app/bundle-explorer-selector" target="_blank">app bundle explorer</a> page in Play Console to check your app's build compliance and get guidance on where your app may need updating.</p>
<image><div style="text-align: center;"><img alt="App bundle explorer in Play Console" border="0" height="236" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEijyWlIEo0VIfXMa4Fncpe3Z2awKYrApAKl3znJH-MX8icZEeS0w4gImHAre-tTxSCYi6RPklCEDJu33y_IN47qhkZg7_FJKsFJqWdkyRx3W79vVMVkGeVnQ0Jrp9JIG5Y8vmAb7AsIlISt_9riAvMDDECOpayrOPzFJrlRKRr0Vlehyf_LxjmuCEKzEsI/w640-h236/app-bundle-explorer-google-play-console.png" width="640" /></div></image><br />
<p>Beyond the app bundle explorer, make sure to also <a href="https://developer.android.com/guide/practices/page-sizes#test" target="_blank"><b>test your app in a 16 KB environment</b></a>. This will help you ensure users don’t experience any issues and that your app delivers its best performance.</p>
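<p>While testing, a quick sanity check is to log the page size the running device or emulator image actually uses. This small sketch uses <span style="font-family: courier;">android.system.Os</span> (available since API 21):</p>

```kotlin
import android.system.Os
import android.system.OsConstants
import android.util.Log

// Logs "4 KB" on today's devices and "16 KB" on a 16 KB system image
fun logPageSize() {
    val pageSizeBytes = Os.sysconf(OsConstants._SC_PAGESIZE)
    Log.d("PageSize", "This device uses ${pageSizeBytes / 1024} KB memory pages")
}
```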
<p>For more information, check out the <a href="https://developer.android.com/guide/practices/page-sizes" target="_blank">full documentation</a>.</p>
<p>Thank you for your continued support in bringing delightful, fast, and high-performance experiences to users across the breadth of devices Play supports. We look forward to seeing the enhanced experiences you'll deliver with 16 KB support.</p>
Building delightful Android camera and media experiences (published 2025-05-07)<meta content="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh2nT4w2e_EWOqk6E4zMVVYhUuQYgsukdfAKL_cEixIoRr7aKCFBjRB-et5zPB_lAASNgixHGvZLmevsjbQ8y75oN6bTPp1-ZQabGrF8umyNnlK-SGgzFXI9kHoqH4aUcDqsiXlEam9VkGf1y0Z4xWy0wlwHmU0c54toszEmZEj95E8-Nrh1zrBxO6fsp0/s1600/android-media-evergreen-option-2.png" name="twitter:image"></meta>
<img src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh2nT4w2e_EWOqk6E4zMVVYhUuQYgsukdfAKL_cEixIoRr7aKCFBjRB-et5zPB_lAASNgixHGvZLmevsjbQ8y75oN6bTPp1-ZQabGrF8umyNnlK-SGgzFXI9kHoqH4aUcDqsiXlEam9VkGf1y0Z4xWy0wlwHmU0c54toszEmZEj95E8-Nrh1zrBxO6fsp0/s1600/android-media-evergreen-option-2.png" style="display: none;" />
<em>Posted by Donovan McMurray, Mayuri Khinvasara Khabya, Mozart Louis, and Nevin Mital – Developer Relations Engineers</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhP3GdxA9YnWdmWaV-WBmA1Iwmrkg0IiuG6HeLa7AQdMcJKnYSjAHlilcdXI0FKvsPU_9JPai55RXpC1P1MoqUow9hplafIccCV_AVAxASuvdxSAlaVICsK_PG73CFWx_6HCrACTZGmxDyQtlvN-ncB7z2JInOSRhQC-NqrCfCtfqeUdhHZhj7HaEsCKbU/s1600/0025-AfD-Android-New-Blog-Header-4209x1253%20%281%29.png"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEhP3GdxA9YnWdmWaV-WBmA1Iwmrkg0IiuG6HeLa7AQdMcJKnYSjAHlilcdXI0FKvsPU_9JPai55RXpC1P1MoqUow9hplafIccCV_AVAxASuvdxSAlaVICsK_PG73CFWx_6HCrACTZGmxDyQtlvN-ncB7z2JInOSRhQC-NqrCfCtfqeUdhHZhj7HaEsCKbU/s1600/0025-AfD-Android-New-Blog-Header-4209x1253%20%281%29.png" /></a>
<p>Hello Android Developers!</p>
<p>We are the Android Developer Relations Camera & Media team, and we’re excited to bring you something a little different today. Over the past several months, we’ve been hard at work writing sample code and building demos that showcase how to take advantage of all the great potential Android offers for building delightful user experiences.</p>
<p>Some of these efforts are available for you to explore now, and some you’ll see later throughout the year, but for this blog post we thought we’d share some of the learnings we gathered while going through this exercise.</p>
<p>Grab your favorite Android plush or rubber duck, and read on to see what we’ve been up to!</p>
<h2><span style="font-size: x-large;">Future-proof your app with Jetpack</span></h2>
<em>Nevin Mital</em>
<p>One of our focuses for the past several years has been improving the developer tools available for video editing on Android. This led to the creation of the <a href="https://developer.android.com/media/media3/transformer" target="_blank">Jetpack Media3 Transformer</a> APIs, which offer solutions for both single-asset and multi-asset video editing preview and export. Today, I’d like to focus on the <a href="https://github.com/androidx/media/tree/main/demos/composition" target="_blank">Composition demo app</a>, a sample app that showcases some of the multi-asset editing experiences that Transformer enables.</p>
<p>I started by adding a custom video compositor to demonstrate how you can arrange input video sequences into different layouts for your final composition, such as a 2x2 grid or a picture-in-picture overlay. You can customize this by implementing a <span style="font-family: courier;"><a href="https://developer.android.com/reference/kotlin/androidx/media3/effect/VideoCompositorSettings" target="_blank">VideoCompositorSettings</a></span> and overriding the <span style="font-family: courier;"><a href="https://developer.android.com/reference/kotlin/androidx/media3/effect/VideoCompositorSettings#getOverlaySettings%28int,long%29" target="_blank">getOverlaySettings</a></span> method. This object can then be set when building your Composition with <span style="font-family: courier;"><a href="https://developer.android.com/reference/kotlin/androidx/media3/transformer/Composition.Builder#setVideoCompositorSettings%28androidx.media3.common.VideoCompositorSettings%29" target="_blank">setVideoCompositorSettings</a></span>.</p>
<p>Here is an example for the 2x2 grid layout:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;">object : VideoCompositorSettings {
...
<span style="color: green; font-weight: bold;">override</span> <span style="color: green; font-weight: bold;">fun</span> <span style="color: blue;">getOverlaySettings</span>(inputId: Int, presentationTimeUs: Long): OverlaySettings {
<span style="color: green; font-weight: bold;">return</span> <span style="color: blue;">when</span> (inputId) {
<span style="color: #666666;">0</span> -> { <span style="color: #408080; font-style: italic;">// First sequence is placed in the top left</span>
StaticOverlaySettings.Builder()
.setScale(<span style="color: #666666;">0.5f</span>, <span style="color: #666666;">0.5f</span>)
.setOverlayFrameAnchor(<span style="color: #666666;">0f</span>, <span style="color: #666666;">0f</span>) <span style="color: #408080; font-style: italic;">// Middle of overlay</span>
.setBackgroundFrameAnchor(-<span style="color: #666666;">0.5f</span>, <span style="color: #666666;">0.5f</span>) <span style="color: #408080; font-style: italic;">// Top-left section of background</span>
.build()
}
<span style="color: #666666;">1</span> -> { <span style="color: #408080; font-style: italic;">// Second sequence is placed in the top right</span>
StaticOverlaySettings.Builder()
.setScale(<span style="color: #666666;">0.5f</span>, <span style="color: #666666;">0.5f</span>)
.setOverlayFrameAnchor(<span style="color: #666666;">0f</span>, <span style="color: #666666;">0f</span>) <span style="color: #408080; font-style: italic;">// Middle of overlay</span>
.setBackgroundFrameAnchor(<span style="color: #666666;">0.5f</span>, <span style="color: #666666;">0.5f</span>) <span style="color: #408080; font-style: italic;">// Top-right section of background</span>
.build()
}
<span style="color: #666666;">2</span> -> { <span style="color: #408080; font-style: italic;">// Third sequence is placed in the bottom left</span>
StaticOverlaySettings.Builder()
.setScale(<span style="color: #666666;">0.5f</span>, <span style="color: #666666;">0.5f</span>)
.setOverlayFrameAnchor(<span style="color: #666666;">0f</span>, <span style="color: #666666;">0f</span>) <span style="color: #408080; font-style: italic;">// Middle of overlay</span>
.setBackgroundFrameAnchor(-<span style="color: #666666;">0.5f</span>, -<span style="color: #666666;">0.5f</span>) <span style="color: #408080; font-style: italic;">// Bottom-left section of background</span>
.build()
}
<span style="color: #666666;">3</span> -> { <span style="color: #408080; font-style: italic;">// Fourth sequence is placed in the bottom right</span>
StaticOverlaySettings.Builder()
.setScale(<span style="color: #666666;">0.5f</span>, <span style="color: #666666;">0.5f</span>)
.setOverlayFrameAnchor(<span style="color: #666666;">0f</span>, <span style="color: #666666;">0f</span>) <span style="color: #408080; font-style: italic;">// Middle of overlay</span>
.setBackgroundFrameAnchor(<span style="color: #666666;">0.5f</span>, -<span style="color: #666666;">0.5f</span>) <span style="color: #408080; font-style: italic;">// Bottom-right section of background</span>
.build()
}
<span style="color: green; font-weight: bold;">else</span> -> {
StaticOverlaySettings.Builder().build()
}
}
}
}
</pre></div>
<p>Since <span style="color: #0d904f; font-family: courier;">getOverlaySettings</span> also provides a presentation time, we can even animate the layout, such as in this picture-in-picture example:</p>
<image><div style="text-align: center;"><img alt="moving image of picture in picture on a mobile device" border="0" height="640" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEh1gvIAiDyYJgdqCELpwbfE7HgTBldURiqYbYM6_5Yfjxmqjw5tHEbhgRSnMnF5ZqMRef40zRpT-s2AOiwT7y9Qar8YjAMVzSKnu0av8C77GoV41C3N3DG1NzphXK-n8mYywc6tvoaqx1W0oODb5hFZWJ0-HOZIuh4eUwA53Uwa_wZTqGeAnwbXDEYojuk/w288-h640/camera-media-picture-in-picture.gif" width="288" /></div></image><br />
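<p>As a hedged illustration of that idea, the sketch below slides a picture-in-picture overlay in during its first second of playback. The input IDs, scale, and anchor values here are illustrative, not taken from the demo app:</p>

```kotlin
object : VideoCompositorSettings {
    override fun getOverlaySettings(inputId: Int, presentationTimeUs: Long): OverlaySettings {
        return if (inputId == 1) { // Second sequence is the overlay
            // Progress goes 0f -> 1f over the first second of playback
            val progress = (presentationTimeUs / 1_000_000f).coerceIn(0f, 1f)
            StaticOverlaySettings.Builder()
                .setScale(0.35f, 0.35f)
                // Slide from the right edge toward the bottom-right corner
                .setBackgroundFrameAnchor(1f - 0.4f * progress, -0.6f)
                .build()
        } else {
            // Background sequence fills the frame with default settings
            StaticOverlaySettings.Builder().build()
        }
    }
}
```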
<p>Next, I spent some time migrating the Composition demo app to use Jetpack Compose. With complicated editing flows, it can help to take advantage of as much screen space as is available, so I decided to use the <a href="https://developer.android.com/develop/ui/compose/layouts/adaptive/build-a-supporting-pane-layout" target="_blank">supporting pane adaptive layout</a>. This way, the user can fine-tune their video creation on the preview screen, while export options are shown alongside it only when a larger display has room for them. Below, you can see how the UI dynamically adapts to the screen size on a foldable device when switching between the outer and inner screens.</p>
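<p>A minimal sketch of such a layout with the Material 3 adaptive APIs might look like this. Note that <span style="font-family: courier;">VideoPreviewPane</span> and <span style="font-family: courier;">ExportOptionsPane</span> are hypothetical placeholder composables, and exact scaffold signatures can vary between library versions:</p>

```kotlin
// Sketch using androidx.compose.material3.adaptive; signatures may differ by version
@Composable
fun CompositionEditorScreen() {
    val navigator = rememberSupportingPaneScaffoldNavigator()
    SupportingPaneScaffold(
        directive = navigator.scaffoldDirective,
        value = navigator.scaffoldValue,
        mainPane = { VideoPreviewPane() },        // fine-tune the composition here
        supportingPane = { ExportOptionsPane() }  // shown alongside on large displays
    )
}
```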
<image><div style="text-align: center;"><img alt="moving image of supporting pane adaptive layout" border="0" height="640" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiRA-GywGhx63aaahWp9Wx6bE_bN_4qk28omhH6z7sINSMHCwtzjgeBBg6WB-xIlyeCU__9qjEU-D_0I8TkLewth438hvHOlhKawfjhaGVpqN-CPzlRKjZJ1YUdKrdigwZJw0HfDP3ogaMRxNWCOEP4LGRsf5IxjYSQ4lBJLSNSI__8_aa88o6usLAbiCg/w331-h640/camera-media-supporting-pane-adaptive-layout.gif" width="331" /></div></image><br />
<p>What’s great is that by using Jetpack Media3 and Jetpack Compose, these features also carry over seamlessly to other devices and form factors, such as the new Android XR platform. Right out-of-the-box, I was able to run the demo app in <a href="https://developer.android.com/develop/xr/jetpack-xr-sdk/transition-home-space-to-full-space" target="_blank">Home Space</a> with the 2D UI I already had. And with some small updates, I was even able to adapt the UI specifically for XR with features such as multiple panels, and to take further advantage of the extra space, an Orbiter with playback controls for the editing preview.</p>
<image><div style="text-align: center;"><img alt="moving image of sequential composition preview in Android XR" border="0" height="640" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEiWt5myPF6tGegayCwASmxtZisL0FIV3EHrIt6Ju50gjnil13EhWs2ijA73IPdQWQQe4jSikN7vhvlsrCLA863WiIngRbIrdZwxmKo2RKbGwgQBavqaadktj1gbdXTnkhfyzbrsms5ln7PuPbkr5_vABKs_JFUGOTT-q68U_XLABNvy8phTJ413zAd8vAs/w640-h640/camera-media-android-xr.gif" width="640" /></div></image><br />
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;">Orbiter(
position = OrbiterEdge.Bottom,
offset = EdgeOffset.inner(offset = MaterialTheme.spacing.standard),
alignment = Alignment.CenterHorizontally,
shape = SpatialRoundedCornerShape(CornerSize(<span style="color: #666666;">28.d</span>p))
) {
Row (horizontalArrangement = Arrangement.spacedBy(MaterialTheme.spacing.mini)) {
<span style="color: #408080; font-style: italic;">// Playback control for rewinding by 10 seconds</span>
FilledTonalIconButton({ viewModel.seekBack(<span style="color: #666666;">10</span>_000L) }) {
Icon(
painter = painterResource(id = R.drawable.rewind_10),
contentDescription = <span style="color: #ba2121;">"Rewind by 10 seconds"</span>
)
}
<span style="color: #408080; font-style: italic;">// Playback control for play/pause</span>
FilledTonalIconButton({ viewModel.togglePlay() }) {
Icon(
painter = painterResource(id = R.drawable.rounded_play_pause_24),
contentDescription =
<span style="color: green; font-weight: bold;">if</span>(viewModel.compositionPlayer.isPlaying) {
<span style="color: #ba2121;">"Pause preview playback"</span>
} <span style="color: green; font-weight: bold;">else</span> {
<span style="color: #ba2121;">"Resume preview playback"</span>
}
)
}
<span style="color: #408080; font-style: italic;">// Playback control for forwarding by 10 seconds</span>
FilledTonalIconButton({ viewModel.seekForward(<span style="color: #666666;">10</span>_000L) }) {
Icon(
painter = painterResource(id = R.drawable.forward_10),
contentDescription = <span style="color: #ba2121;">"Forward by 10 seconds"</span>
)
}
}
}
</pre></div>
<h2><span style="font-size: x-large;">Jetpack libraries unlock premium functionality incrementally</span></h2>
<em>Donovan McMurray</em>
<p>Not only do our Jetpack libraries have you covered by working consistently across existing and future devices, but they also open the doors to advanced functionality and custom behaviors to support all types of app experiences. In a nutshell, our Jetpack libraries aim to make the common case very accessible and easy, while providing hooks for adding more custom features later.</p>
<p>We’ve worked with many app teams that have switched to a Jetpack library, built the basics, added their critical custom features, and actually saved developer time compared to their estimates. Let’s take a look at CameraX and how this incremental development can supercharge your process.</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: #408080; font-style: italic;">// Set up CameraX app with preview and image capture.</span>
<span style="color: #408080; font-style: italic;">// Note: setting the resolution selector is optional, and if not set,</span>
<span style="color: #408080; font-style: italic;">// then a default 4:3 ratio will be used.</span>
<span style="color: green; font-weight: bold;">val</span> aspectRatioStrategy = AspectRatioStrategy(
AspectRatio.RATIO_16_9, AspectRatioStrategy.FALLBACK_RULE_NONE)
<span style="color: green; font-weight: bold;">var</span> resolutionSelector = ResolutionSelector.Builder()
.setAspectRatioStrategy(aspectRatioStrategy)
.build()
<span style="color: green; font-weight: bold;">private</span> <span style="color: green; font-weight: bold;">val</span> previewUseCase = Preview.Builder()
.setResolutionSelector(resolutionSelector)
.build()
<span style="color: green; font-weight: bold;">private</span> <span style="color: green; font-weight: bold;">val</span> imageCaptureUseCase = ImageCapture.Builder()
.setResolutionSelector(resolutionSelector)
.setCaptureMode(ImageCapture.CAPTURE_MODE_MINIMIZE_LATENCY)
.build()
<span style="color: green; font-weight: bold;">val</span> useCaseGroupBuilder = UseCaseGroup.Builder()
.addUseCase(previewUseCase)
.addUseCase(imageCaptureUseCase)
cameraProvider.unbindAll()
camera = cameraProvider.bindToLifecycle(
<span style="color: green; font-weight: bold;">this</span>, <span style="color: #408080; font-style: italic;">// lifecycleOwner</span>
CameraSelector.DEFAULT_BACK_CAMERA,
useCaseGroupBuilder.build(),
)
</pre></div>
<p>After setting up the basic structure for CameraX, you can set up a simple UI with a camera preview and a shutter button. You can use the CameraX Viewfinder composable which displays a Preview stream from a CameraX SurfaceRequest.</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: #408080; font-style: italic;">// Create preview</span>
Box(
Modifier
.background(Color.Black)
.fillMaxSize(),
contentAlignment = Alignment.Center,
) {
surfaceRequest?.let {
CameraXViewfinder(
modifier = Modifier.fillMaxSize(),
implementationMode = ImplementationMode.EXTERNAL,
surfaceRequest = surfaceRequest,
)
}
Button(
onClick = onPhotoCapture,
shape = CircleShape,
colors = ButtonDefaults.buttonColors(containerColor = Color.White),
modifier = Modifier
.height(<span style="color: #666666;">75.d</span>p)
.width(<span style="color: #666666;">75.d</span>p),
) {} <span style="color: #408080; font-style: italic;">// Empty content lambda; the white circle itself is the shutter button</span>
}
<span style="color: green; font-weight: bold;">fun</span> <span style="color: blue;">onPhotoCapture</span>() {
<span style="color: #408080; font-style: italic;">// Not shown: defining the ImageCapture.OutputFileOptions for</span>
<span style="color: #408080; font-style: italic;">// your saved images</span>
imageCaptureUseCase.takePicture(
outputOptions,
ContextCompat.getMainExecutor(context),
object : ImageCapture.OnImageSavedCallback {
<span style="color: green; font-weight: bold;">override</span> <span style="color: green; font-weight: bold;">fun</span> <span style="color: blue;">onError</span>(exc: ImageCaptureException) {
<span style="color: green; font-weight: bold;">val</span> msg = <span style="color: #ba2121;">"Photo capture failed."</span>
Toast.makeText(context, msg, Toast.LENGTH_SHORT).show()
}
<span style="color: green; font-weight: bold;">override</span> <span style="color: green; font-weight: bold;">fun</span> <span style="color: blue;">onImageSaved</span>(output: ImageCapture.OutputFileResults) {
<span style="color: green; font-weight: bold;">val</span> savedUri = output.savedUri
<span style="color: green; font-weight: bold;">if</span> (savedUri != <span style="color: green; font-weight: bold;">null</span>) {
<span style="color: #408080; font-style: italic;">// Do something with the savedUri if needed</span>
} <span style="color: green; font-weight: bold;">else</span> {
<span style="color: green; font-weight: bold;">val</span> msg = <span style="color: #ba2121;">"Photo capture failed."</span>
Toast.makeText(context, msg, Toast.LENGTH_SHORT).show()
}
}
},
)
}
</pre></div>
<p>You’re already on track for a solid camera experience, but what if you wanted to add some extra features for your users? Adding filters and effects is easy with CameraX’s Media3 effect integration, which is <a href="https://android-developers.googleblog.com/2024/12/whats-new-in-camerax-140-and-jetpack-compose-support.html" target="_blank">one of the new features introduced in CameraX 1.4.0</a>.</p>
<p>Here’s how simple it is to add a black and white filter from Media3’s built-in effects.</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: green; font-weight: bold;">val</span> media3Effect = Media3Effect(
application,
PREVIEW or IMAGE_CAPTURE,
ContextCompat.getMainExecutor(application),
{},
)
media3Effect.setEffects(listOf(RgbFilter.createGrayscaleFilter()))
useCaseGroupBuilder.addEffect(media3Effect)
</pre></div>
<p>The <a href="https://developer.android.com/reference/androidx/camera/media3/effect/Media3Effect" target="_blank">Media3Effect</a> object takes a <a href="https://developer.android.com/reference/android/content/Context" target="_blank">Context</a>, a bitwise representation of the <a href="https://developer.android.com/reference/androidx/camera/core/CameraEffect#constants_1" target="_blank">use case constants</a> for targeted <a href="https://developer.android.com/reference/androidx/camera/core/UseCase" target="_blank">UseCases</a>, an <a href="https://developer.android.com/reference/java/util/concurrent/Executor.html" target="_blank">Executor</a>, and an error listener. Then you set the list of effects you want to apply. Finally, you add the effect to the <span style="color: #0d904f; font-family: courier;">useCaseGroupBuilder</span> we defined earlier.</p>
<image><div style="text-align: center;"><img alt="side-by-side comparison of the camera app before and after the grayscale filter is applied" border="0" height="640" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEicrINuUuJVTYNdcqv2L433Iwt689pZhDgwqBB-s0vIqsMQpPOpfeKjnUjF7uo-D6Re98OOY6IRTqLRUCKoNKj1U7MqIMgE2GijeAUDa_nQIE0Vv9eM6IXZ_2L0Rp4qX9DfOldEe3KaOdWRaKbQ2qHbxQc9sOARlrTRwHugkRoFudYDjwDZPvhVzv8LcUk/s1600/black-and-white-filter-media3-built-in-effects.png" width="640" /></div><imgcaption><center><em>(Left) Our camera app with no filter applied. </em></center><center><em> (Right) Our camera app after the createGrayscaleFilter was added.
</em></center></imgcaption></image><br />
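<p>The <span style="font-family: courier;">PREVIEW or IMAGE_CAPTURE</span> argument is an ordinary bitwise flag combination. A quick plain-Kotlin sketch of how such a target mask behaves (the constant values below are illustrative; the real ones are defined on <span style="font-family: courier;">CameraEffect</span>):</p>

```kotlin
// Illustrative flag values; CameraX defines the real constants on CameraEffect.
const val PREVIEW = 1 shl 0
const val VIDEO_CAPTURE = 1 shl 1
const val IMAGE_CAPTURE = 1 shl 2

// An effect targets a use case when that use case's bit is present in the mask.
fun targets(mask: Int, useCase: Int): Boolean = mask and useCase != 0

fun main() {
    val mask = PREVIEW or IMAGE_CAPTURE
    println(targets(mask, PREVIEW))       // true: the filter applies to the preview
    println(targets(mask, VIDEO_CAPTURE)) // false: video capture is left untouched
}
```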
<p>There are many other built-in effects you can add, too! See the Media3 <a href="https://developer.android.com/reference/androidx/media3/common/Effect" target="_blank">Effect</a> documentation for more options, like brightness, color lookup tables (LUTs), contrast, blur, and many other effects.</p>
<p>To take your effects to yet another level, it’s also possible to define your own effects by implementing the <span style="font-family: courier;"><a href="https://developer.android.com/reference/androidx/media3/effect/GlEffect" target="_blank">GlEffect</a></span> interface, which acts as a factory of <span style="font-family: courier;"><a href="https://developer.android.com/reference/androidx/media3/effect/GlShaderProgram" target="_blank">GlShaderPrograms</a></span>. You can implement a <span style="font-family: courier;"><a href="https://developer.android.com/reference/androidx/media3/effect/BaseGlShaderProgram" target="_blank">BaseGlShaderProgram</a></span>’s <span style="font-family: courier;"><a href="https://developer.android.com/reference/androidx/media3/effect/BaseGlShaderProgram" target="_blank">drawFrame()</a></span> method to implement a custom effect of your own. A minimal implementation should tell your graphics library to use its shader program, bind the shader program's vertex attributes and uniforms, and issue a drawing command.</p>
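<p>To make the shape of that contract concrete, here is a non-runnable sketch of a custom effect. Class names and constructor parameters reflect our reading of the Media3 effect APIs and may differ between library versions; the shader compilation is elided:</p>

```kotlin
// Sketch only: a GlEffect is a factory for the shader program that does the work.
class MyCustomEffect : GlEffect {
    override fun toGlShaderProgram(context: Context, useHdr: Boolean): GlShaderProgram =
        MyShaderProgram()
}

class MyShaderProgram : BaseGlShaderProgram(
    /* useHighPrecisionColorComponents= */ false, /* texturePoolCapacity= */ 1) {

    // glProgram would wrap your compiled vertex and fragment shaders (elided here).
    override fun configure(inputWidth: Int, inputHeight: Int): Size = Size(inputWidth, inputHeight)

    override fun drawFrame(inputTexId: Int, presentationTimeUs: Long) {
        glProgram.use()                       // 1. tell GL to use the shader program
        glProgram.setSamplerTexIdUniform("uTexSampler", inputTexId, /* texUnitIndex= */ 0)
        glProgram.bindAttributesAndUniforms() // 2. bind vertex attributes and uniforms
        GLES20.glDrawArrays(GLES20.GL_TRIANGLE_STRIP, /* first= */ 0, /* count= */ 4) // 3. draw
    }
}
```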
<p>Jetpack libraries meet you where you are and your app’s needs. Whether that be a simple, fast-to-implement, and reliable implementation, or custom functionality that helps the critical user journeys in your app stand out from the rest, Jetpack has you covered!</p>
<h2><span style="font-size: x-large;">Jetpack offers a foundation for innovative AI Features</span></h2>
<i>Mayuri Khinvasara Khabya</i>
<p>Just as Donovan demonstrated with CameraX for capture, Jetpack <a href="https://developer.android.com/media" target="_blank">Media3</a> provides a reliable, customizable, and feature-rich solution for playback with ExoPlayer. The AI Samples app builds on this foundation to delight users with helpful and enriching AI-driven additions.</p>
<p>In today's rapidly evolving digital landscape, users expect more from their media applications. Simply playing videos is no longer enough. Developers are constantly seeking ways to enhance user experiences and provide deeper engagement. Leveraging the power of Artificial Intelligence (AI), particularly when built upon robust media frameworks like Media3, offers exciting opportunities. Let’s take a look at some of the ways we can transform the way users interact with video content:</p>
<ul><ul>
<li><b>Empowering Video Understanding:</b> The core idea is to use AI, specifically multimodal models like the Gemini Flash and Pro models, to analyze video content and extract meaningful information. This goes beyond simply playing a video; it's about understanding what's in the video and making that information readily accessible to the user.</li></ul><ul>
<li><b>Actionable Insights:</b> The goal is to transform raw video into summaries, insights, and interactive experiences. This allows users to quickly grasp the content of a video and find specific information they need or learn something new!</li></ul><ul>
<li><b>Accessibility and Engagement:</b> AI helps make videos more accessible by providing features like summaries, translations, and descriptions. It also aims to increase user engagement through interactive features.</li>
</ul></ul>
<h3><span style="font-size: large;">A Glimpse into AI-Powered Video Journeys</span></h3>
<p>The following example demonstrates potential video journeys enhanced by artificial intelligence. This sample integrates several components, such as ExoPlayer and Transformer from Media3; the Firebase SDK (leveraging Vertex AI on Android); and Jetpack Compose, ViewModel, and StateFlow. The code will be available soon on <a href="https://github.com/android/ai-samples/tree/main/ai-catalog" target="_blank">GitHub</a>.</p>
<image><div style="text-align: center;"><img alt="moving images of examples of AI-powered video journeys" border="0" height="624" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEjej2Wp0E2-hX0G8cwNcEtEDuhRbcMuAMLk3_u1Og7XNr-jIgAbidX3JMMlEiimyKDlG1lzbNoq9eMM_czZv5ug5rvhwzSw8JC5M0wokDNwUHINEpqX9ZqDaR0EzRnu8bb58qzSXBG2xd0XGNFk4d_lIkC4Sf8fGgQEFdF31oDws1uE8y5m5J4Wxl54q8U/w640-h624/camera-media-video-assist-gemini.gif" width="640" /></div><imgcaption><center><em>(Left) Video summarization </em></center><center><em> (Right) Thumbnails timestamps and HDR frame extraction</em></center></imgcaption></image><br />
<p>There are two experiences in particular that I’d like to highlight:</p>
<ul><ul>
<li><b>HDR Thumbnails:</b> AI can help identify key moments in the video that could make for good thumbnails. With those timestamps, you can use the new <span style="font-family: courier;"><a href="https://developer.android.com/reference/kotlin/androidx/media3/transformer/ExperimentalFrameExtractor" target="_blank">ExperimentalFrameExtractor</a></span> API from Media3 to extract HDR thumbnails from videos, providing richer visual previews.</li>
<li><b>Text-to-Speech:</b> AI can be used to convert textual information derived from the video into spoken audio, enhancing accessibility. On Android you can also choose to play audio in different languages and dialects thus enhancing personalization for a wider audience.</li>
</ul></ul>
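<p>A small detail when wiring the thumbnail journey together: the model returns timestamps as text, while frame extraction wants positions in microseconds. A minimal, hypothetical helper for the conversion (the <span style="font-family: courier;">"mm:ss"</span> format is an assumption for illustration):</p>

```kotlin
// Hypothetical helper: convert an "mm:ss" timestamp suggested by the model
// into a microsecond position suitable for frame extraction.
fun parseTimestampToUs(timestamp: String): Long {
    val (minutes, seconds) = timestamp.trim().split(":").map { it.toLong() }
    return (minutes * 60 + seconds) * 1_000_000L
}

fun main() {
    // e.g. the model flags a key moment at 1:30 into the video
    println(parseTimestampToUs("1:30")) // 90000000
}
```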
<h3><span style="font-size: large;">Using the right AI solution</span></h3>
<p>Currently, only cloud models support video inputs, so we went ahead with a cloud-based solution. Integrating Firebase in our sample empowers the app to:</p>
<ul><ul>
<li>Generate real-time, concise video summaries automatically.</li>
<li>Produce comprehensive content metadata, including chapter markers and relevant hashtags.</li>
<li>Facilitate seamless multilingual content translation.</li>
</ul></ul>
<p>So how do you actually interact with a video and work with Gemini to process it? First, send your video as an input parameter to your prompt:</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%"><span style="color: #008000; font-weight: bold">val</span> promptData =
<span style="color: #BA2121">"Summarize this video in the form of top 3-4 takeaways only. Write in the form of bullet points. Don't assume if you don't know"</span>
<span style="color: #008000; font-weight: bold">val</span> generativeModel = Firebase.vertexAI.generativeModel(<span style="color: #BA2121">"gemini-2.0-flash"</span>)
_outputText.value = OutputTextState.Loading
viewModelScope.launch(Dispatchers.IO) {
<span style="color: #008000; font-weight: bold">try</span> {
<span style="color: #008000; font-weight: bold">val</span> requestContent = content {
fileData(videoSource.toString(), <span style="color: #BA2121">"video/mp4"</span>)
text(promptData)
}
<span style="color: #008000; font-weight: bold">val</span> outputStringBuilder = StringBuilder()
generativeModel.generateContentStream(requestContent).collect { response ->
outputStringBuilder.append(response.text)
_outputText.value = OutputTextState.Success(outputStringBuilder.toString())
}
_outputText.value = OutputTextState.Success(outputStringBuilder.toString())
} <span style="color: #008000; font-weight: bold">catch</span> (error: Exception) {
_outputText.value = OutputTextState.Error(error.localizedMessage ?: <span style="color: #BA2121">"Unknown error"</span>)
}
}
</pre></div>
<p>Notice there are two key components here:</p>
<ul><ul>
<li><b>FileData:</b> This component integrates the video into the query.</li>
<li><b>Prompt:</b> This specifies what assistance the user needs from the AI in relation to the provided video.</li>
</ul></ul>
<p>Of course, you can fine-tune your prompt to your specific requirements, and the model will shape its responses accordingly.</p>
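<p>One way to keep that flexibility manageable is a small catalog of prompts, one per journey, selected at call time. A hypothetical sketch (the task names and wording here are illustrative, not part of the sample):</p>

```kotlin
// Hypothetical prompt catalog: one prompt per AI-powered video journey.
enum class VideoTask { SUMMARY, CHAPTERS, TRANSLATION }

fun promptFor(task: VideoTask, language: String = "English"): String = when (task) {
    VideoTask.SUMMARY ->
        "Summarize this video in the form of top 3-4 takeaways only. " +
        "Write in the form of bullet points. Don't assume if you don't know"
    VideoTask.CHAPTERS ->
        "List chapter markers for this video as timestamp and title pairs."
    VideoTask.TRANSLATION ->
        "Translate the key points of this video into $language."
}
```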
<p>In conclusion, by harnessing the capabilities of Jetpack Media3 and integrating AI solutions like Gemini through Firebase, you can significantly elevate video experiences on Android. This combination enables advanced features like video summaries, enriched metadata, and seamless multilingual translations, ultimately enhancing accessibility and engagement for users. As these technologies continue to evolve, the potential for creating even more dynamic and intelligent video applications is vast.</p>
<h2><span style="font-size: x-large;">Go above-and-beyond with specialized APIs</span></h2>
<i>Mozart Louis</i>
<p>Android 16 introduces the new audio PCM Offload mode, which can reduce the power consumption of audio playback in your app, leading to longer playback time and increased user engagement. Eliminating power anxiety greatly enhances the user experience.</p>
<p><a href="https://github.com/google/oboe" target="_blank">Oboe</a> is Android’s premier audio API for building high-performance, low-latency audio apps. Android 16 adds a new feature to the Android NDK called Native PCM Offload playback.</p>
<p>Offload playback helps save battery life when playing audio. It works by sending a large chunk of audio to a special part of the device's hardware (a DSP). This allows the CPU of the device to go into a low-power state while the DSP handles playing the sound. This works with uncompressed audio (like PCM) and compressed audio (like MP3 or AAC), where the DSP also takes care of decoding.</p>
<p>This can result in significant power saving while playing back audio and is perfect for applications that play audio in the background or while the screen is off (think audiobooks, podcasts, music etc).</p>
<p>We created the <a href="https://github.com/google/oboe/tree/powerplay-sample" target="_blank">sample app PowerPlay</a> to demonstrate how to implement these features using the latest NDK version, C++ and Jetpack Compose.</p>
<p>Here are the most important parts!</p>
<p>The first order of business is to ensure the device supports audio offload for the stream attributes you need. In the example below, we check whether the device supports offloading a stereo, float PCM stream with a sample rate of 48,000 Hz.</p>
<!-- Kotlin --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%"> <span style="color: #008000; font-weight: bold">val</span> format = AudioFormat.Builder()
.setEncoding(AudioFormat.ENCODING_PCM_FLOAT)
.setSampleRate(<span style="color: #666666">48000</span>)
.setChannelMask(AudioFormat.CHANNEL_OUT_STEREO)
.build()
<span style="color: #008000; font-weight: bold">val</span> attributes =
AudioAttributes.Builder()
.setContentType(AudioAttributes.CONTENT_TYPE_MUSIC)
.setUsage(AudioAttributes.USAGE_MEDIA)
.build()
<span style="color: #008000; font-weight: bold">val</span> isOffloadSupported =
<span style="color: #008000; font-weight: bold">if</span> (Build.VERSION.SDK_INT >= Build.VERSION_CODES.Q) {
AudioManager.isOffloadedPlaybackSupported(format, attributes)
} <span style="color: #008000; font-weight: bold">else</span> {
<span style="color: #008000; font-weight: bold">false</span>
}
player.initializeAudio(isOffloadSupported)
</pre></div>
<p>Once we know the device supports audio offload, we can confidently set the Oboe audio streams’ performance mode to the new performance mode option, <span style="font-family: courier;"><a href="https://github.com/google/oboe/blob/powerplay-sample/include/oboe/Definitions.h#L293" target="_blank">PerformanceMode::POWER_SAVING_OFFLOADED</a></span>.</p>
<!-- C++ --><div style="background: #f8f8f8; overflow:auto;width:auto;border:0;"><pre style="margin: 0; line-height: 125%"><span style="color: #B00040">void</span> Player::initializeAudio(<span style="color: #B00040">bool</span> isOffloadSupported) {
<span style="color: #408080; font-style: italic">// Create an audio stream</span>
AudioStreamBuilder builder;
builder.setChannelCount(mChannelCount);
builder.setDataCallback(mDataCallback);
builder.setFormat(AudioFormat::Float);
builder.setSampleRate(<span style="color: #666666">48000</span>);
builder.setErrorCallback(mErrorCallback);
builder.setPresentationCallback(mPresentationCallback);
<span style="color: #008000; font-weight: bold">if</span> (isOffloadSupported) {
builder.setPerformanceMode(oboe::PerformanceMode::POWER_SAVING_OFFLOADED);
builder.setFramesPerDataCallback(<span style="color: #666666">128</span>); <span style="color: #408080; font-style: italic">// set a low frame buffer amount</span>
} <span style="color: #008000; font-weight: bold">else</span> {
builder.setPerformanceMode(oboe::PerformanceMode::LowLatency);
}
builder.setSharingMode(SharingMode::Exclusive);
builder.setSampleRateConversionQuality(SampleRateConversionQuality::Medium);
Result result = builder.openStream(mAudioStream);
}
</pre></div>
<p>Now when audio is played back, it will be offloaded to the DSP, helping save power.</p>
<p>There is more to this feature, which will be covered in a future blog post fully detailing all of the new APIs that will help you optimize your audio playback experience!</p>
<h2><span style="font-size: x-large;">What’s next</span></h2>
<p>Of course, we were only able to share the tip of the iceberg with you here, so to dive deeper into the samples, check out the following links:</p>
<ul><ul>
<li><a href="https://github.com/androidx/media/tree/main/demos/composition" target="_blank">Jetpack Media3 Composition Demo app</a></li>
<li><a href="https://github.com/android/socialite" target="_blank">SociaLite</a></li>
<li><a href="https://github.com/android/ai-samples/" target="_blank">AI Samples</a></li>
<li><a href="https://github.com/google/oboe/tree/powerplay-sample/samples/powerplay" target="_blank">PowerPlay</a></li>
</ul></ul>
<p>Hopefully these examples have inspired you to explore what new and fascinating experiences you can build on Android. Tune in to <a href="https://io.google/2025/explore/technical-session-19" target="_blank">our session at Google I/O</a> in a couple weeks to learn even more about use-cases supported by solutions like Jetpack CameraX and Jetpack Media3!</p>
<h2><span style="font-size: x-large;">Zoho Achieves 6x Faster Logins with Passkey and Credential Manager Integration</span></h2>
<em>Posted by <a href="https://x.com/thedroidlady" target="_blank">Niharika Arora</a> – Senior Developer Relations Engineer, Joseph Lewis – Staff Technical Writer, and Kumareshwaran Sreedharan – Product Manager, Zoho.</em>
<a href="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5qpwVAje1AjhxTuWNrSBSelhGRekoGR0BYjBIJIkK1OEur5I0AXf8QRJJ4zvt1YIPu6UHS-byMolDEWSwVreJInX3OnOAQbd15rKH_YV04B5Bq2qrsY0ZU4x6ULPMWiqHtmMo4nECQkaeSFgylV98zp0QWriwSBjjbsufqXSGlNHuN6L3esFrVLZ_7d4/s1600/ANDDM_Zoho_Header.gif"><img border="0" data-original-height="800" data-original-width="100%" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEi5qpwVAje1AjhxTuWNrSBSelhGRekoGR0BYjBIJIkK1OEur5I0AXf8QRJJ4zvt1YIPu6UHS-byMolDEWSwVreJInX3OnOAQbd15rKH_YV04B5Bq2qrsY0ZU4x6ULPMWiqHtmMo4nECQkaeSFgylV98zp0QWriwSBjjbsufqXSGlNHuN6L3esFrVLZ_7d4/s1600/ANDDM_Zoho_Header.gif" /></a>
<p>As an Android developer, you're constantly looking for ways to enhance security, improve user experience, and streamline development. Zoho, a comprehensive cloud-based software suite focused on security and seamless experiences, achieved significant improvements by adopting passkeys in their <a href="https://play.google.com/store/apps/details?id=com.zoho.accounts.oneauth&hl=en_IN" target="_blank">OneAuth</a> Android app.</p>
<p>Since integrating passkeys in 2024, Zoho achieved <b>login speeds up to 6x faster</b> than previous methods and a <b>31% month-over-month (MoM) growth in passkey adoption</b>.</p>
<p>This case study examines Zoho's adoption of passkeys and Android's <a href="https://developer.android.com/training/sign-in/passkeys" target="_blank">Credential Manager API</a> to address authentication difficulties. It details the technical implementation process and highlights the impactful results.</p>
<h2><span style="font-size: x-large;">Overcoming authentication challenges</span></h2>
<p>Zoho used a combination of authentication methods to protect user accounts. These included Zoho <a href="https://www.zoho.com/accounts/oneauth/" target="_blank">OneAuth</a>, their own multi-factor authentication (MFA) solution, which supported both password-based and passwordless authentication using push notifications, QR codes, and time-based one-time passwords (TOTP). Zoho also supported federated logins, allowing authentication through Security Assertion Markup Language (SAML) and other third-party identity providers.</p>
<h3><span style="font-size: large;">Challenges</span></h3>
<p>Zoho, like many organizations, aimed to improve authentication security and user experience while reducing operational burdens. The primary challenges that led to the adoption of passkeys included:</p>
<ul><ul>
<li>Security vulnerabilities: Traditional password-based methods left users susceptible to phishing attacks and password breaches.</li></ul><ul>
<li>User friction: Password fatigue led to forgotten passwords, frustration, and increased reliance on cumbersome recovery processes.</li></ul><ul>
<li>Operational inefficiencies: Handling password resets and MFA issues generated significant support overhead.</li></ul><ul>
<li>Scalability concerns: A growing user base demanded a more secure and efficient authentication solution.</li>
</ul></ul>
<h3><span style="font-size: large;">Why the shift to passkeys?</span></h3>
<p>Passkeys were implemented in Zoho's apps to address authentication challenges by offering a passwordless approach that significantly improves security and user experience. This solution leverages phishing-resistant authentication, cloud-synchronized credentials for effortless cross-device access, and biometrics (such as a fingerprint or facial recognition), PIN, or pattern for secure logins, thereby reducing the vulnerabilities and inconveniences associated with traditional passwords.</p>
<p>By adopting passkeys with Credential Manager, Zoho cut login times by <b>up to 6x</b>, slashed password-related support costs, and saw <b>strong</b> user adoption – <b>doubling</b> passkey sign-ins in 4 months with <b>31% MoM growth</b>. Zoho users now enjoy <b>faster, easier logins and phishing-resistant security</b>.</p>
<image><div style="text-align: center;"><img alt="Quote card reads 'Cloud Lion now enjoys logins that are 30% faster and more secure using passkeys – allowing us to use our thumb instead of a password. With passkeys, we can also protect our critical business data against phishing and brute force attacks.' – Fabrice Venegas, Founder, Cloud Lion (a Zoho integration partner)" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEj_KgbYAjdIfbrxwxOCRZF36RV0C0bRNtfjF0Y0RnIPgCnShzvFf1OLZv35aegZmYhC2SCr6AKGngCxDMF_zA9BZwtwst_PtdxtczPd_n_9d1ruFfI36i1oqMrro8TEz20rKehP5PMbcbaA0mnmcG76ke0bW0_bBxgvtYxGIhDidr8yEGaVUxujAt5jAOM/s16000/ANDDM_Zoho_Quote.png" /></div></image>
<h2><span style="font-size: x-large;">Implementation with Credential Manager on Android</span></h2>
<p>So, how did Zoho achieve these results? They used Android's Credential Manager API, the recommended Jetpack library for implementing authentication on Android.</p>
<p>Credential Manager provides a unified API that simplifies handling of the various authentication methods. Instead of juggling different APIs for passwords, passkeys, and federated logins (like Sign in with Google), you use a single interface.</p>
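<p>Concretely, a single <span style="color: #0d904f; font-family: courier;">GetCredentialRequest</span> can carry several credential options at once. The sketch below shows the shape of such a call; it is illustrative, not Zoho's production code, and <span style="color: #0d904f; font-family: courier;">requestJson</span> stands in for the WebAuthn options JSON fetched from your server:</p>

```kotlin
import android.content.Context
import androidx.credentials.CredentialManager
import androidx.credentials.GetCredentialRequest
import androidx.credentials.GetPasswordOption
import androidx.credentials.GetPublicKeyCredentialOption

// One request, several acceptable credential types: saved passwords and passkeys.
suspend fun signIn(context: Context, requestJson: String) {
    val request = GetCredentialRequest(
        listOf(
            GetPasswordOption(),
            GetPublicKeyCredentialOption(requestJson = requestJson),
        )
    )
    // A single call surfaces whichever credentials the user actually has.
    val result = CredentialManager.create(context).getCredential(context, request)
    // Inspect result.credential to see which credential type the user chose.
}
```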
<p>Implementing passkeys at Zoho required both client-side and server-side adjustments. Here's a detailed breakdown of the passkey creation, sign-in, and server-side implementation process.</p>
<h3><span style="font-size: large;">Passkey creation</span></h3>
<image><div style="text-align: center;"><img alt="Passkey creation in OneAuth on a small screen mobile device" border="0" id="imgCaption" src="https://blogger.googleusercontent.com/img/b/R29vZ2xl/AVvXsEimfnzjHxc_7oDTDgQEmgDpiAE4u_nQ5IohsE1EeDctS23kSiFHmEG20nwI3LqiRw5OU1u1gBDUd6O3dbJU4lVQvR6eqPWT5CTcITJb8rC5VLay60WHtLziC9EmfypOGNhpZRbWeBSFz9u9pKTj08jqCvazVEu3ZdrWHXxpqz7ZWZSsO8UliPOWOTgua-4/s16000/image1.png" width="45%"/></div></image><br />
<p>To <a href="https://developer.android.com/identity/sign-in/credential-manager#registration-flows" target="_blank">create a passkey</a>, the app first retrieves configuration details, including a unique challenge, from Zoho's server. This configuration data, formatted as a <span style="color: #0d904f; font-family: courier;">requestJson</span> string, is used by the app to build a <span style="color: #0d904f; font-family: courier;">CreatePublicKeyCredentialRequest</span>. The app then calls the <span style="color: #0d904f; font-family: courier;">credentialManager.createCredential</span> method, which prompts the user to authenticate using their device screen lock (biometrics, fingerprint, PIN, etc.).</p>
<p>Upon successful user confirmation, the app receives the new passkey credential data, sends it back to Zoho's server for verification, and the server then stores the passkey information linked to the user's account. Failures or user cancellations during the process are caught and handled by the app.</p>
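<p>In code, that creation flow has roughly the following shape (an illustrative sketch, not Zoho's production code; <span style="color: #0d904f; font-family: courier;">sendRegistrationToServer</span> is a hypothetical helper standing in for the server round trip):</p>

```kotlin
import android.content.Context
import androidx.credentials.CreatePublicKeyCredentialRequest
import androidx.credentials.CreatePublicKeyCredentialResponse
import androidx.credentials.CredentialManager
import androidx.credentials.exceptions.CreateCredentialException

suspend fun createPasskey(context: Context, requestJson: String) {
    val credentialManager = CredentialManager.create(context)
    try {
        // Prompts the user to confirm with their screen lock (biometrics, PIN, pattern).
        val response = credentialManager.createCredential(
            context,
            CreatePublicKeyCredentialRequest(requestJson = requestJson),
        ) as CreatePublicKeyCredentialResponse
        // Hypothetical helper: the server verifies and stores the new passkey.
        sendRegistrationToServer(response.registrationResponseJson)
    } catch (e: CreateCredentialException) {
        // Cancellations and failures land here.
    }
}
```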
<h3><span style="font-size: large;">Sign-in</span></h3>
<p>The Zoho Android app initiates the <a href="https://developer.android.com/identity/sign-in/credential-manager#sign-in" target="_blank">passkey sign-in</a> process by requesting sign-in options, including a unique <span style="color: #0d904f; font-family: courier;">challenge</span>, from Zoho's backend server. The app then uses this data to construct a <span style="color: #0d904f; font-family: courier;">GetCredentialRequest</span>, indicating it will authenticate with a passkey. It then invokes the Android <span style="color: #0d904f; font-family: courier;">CredentialManager.getCredential()</span> API with this request. This action triggers a standardized Android system interface, prompting the user to choose their Zoho account (if multiple passkeys exist) and authenticate using their device's configured screen lock (fingerprint, face scan, or PIN). After successful authentication, Credential Manager returns a signed assertion (proof of login) to the Zoho app. The app forwards this assertion to Zoho's server, which verifies the signature against the user's stored public key and validates the challenge, completing the secure sign-in process.</p>
<h3><span style="font-size: large;">Server-side implementation</span></h3>
<p>Zoho's transition to supporting passkeys benefited from their backend systems already being FIDO WebAuthn compliant, which streamlined the server-side implementation process. However, specific modifications were still necessary to fully integrate passkey functionality.</p>
<p>The most significant challenge involved adapting the credential storage system. Zoho's existing authentication methods, which primarily used passwords and FIDO security keys for multi-factor authentication, required different storage approaches than passkeys, which are based on cryptographic public keys. To address this, Zoho implemented a new database schema specifically designed to securely store passkey public keys and related data according to WebAuthn protocols. This new system was built alongside a lookup mechanism to validate and retrieve credentials based on user and device information, ensuring backward compatibility with older authentication methods.</p>
<p>Another server-side adjustment involved implementing the ability to handle requests from Android devices. Passkey requests originating from Android apps use a unique origin format (<span style="color: #0d904f; font-family: courier;">android:apk-key-hash:example</span>) that is distinct from standard web origins that use a URI-based format (<span style="color: #0d904f; font-family: courier;">https://example.com/app</span>). The server logic needed to be updated to correctly parse this format, extract the SHA-256 fingerprint hash of the app's signing certificate, and validate it against a pre-registered list. This verification step ensures that authentication requests genuinely originate from Zoho's Android app and protects against phishing attacks.</p>
<p>This code snippet demonstrates how the server checks for the Android-specific origin format and validates the certificate hash:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: green; font-weight: bold;">val</span> origin: String = clientData.getString(<span style="color: #ba2121;">"origin"</span>)
<span style="color: green; font-weight: bold;">if</span> (origin.startsWith(<span style="color: #ba2121;">"android:apk-key-hash:"</span>)) {
<span style="color: green; font-weight: bold;">val</span> originSplit: List&lt;String&gt; = origin.split(<span style="color: #ba2121;">":"</span>)
<span style="color: green; font-weight: bold;">if</span> (originSplit.size > <span style="color: #666666;">3</span>) {
<span style="color: green; font-weight: bold;">val</span> androidOriginHashDecoded: ByteArray = Base64.getDecoder().decode(originSplit[<span style="color: #666666;">3</span>])
<span style="color: green; font-weight: bold;">if</span> (!androidOriginHashDecoded.contentEquals(oneAuthSha256FingerPrint)) {
<span style="color: green; font-weight: bold;">throw</span> <span style="color: blue;">IAMException</span>(IAMErrorCode.WEBAUTH003)
}
} <span style="color: green; font-weight: bold;">else</span> {
<span style="color: #408080; font-style: italic;">// Optional: Handle the case where the origin string is malformed</span>
}
}
</pre></div>
<h3><span style="font-size: large;">Error handling</span></h3>
<p>Zoho implemented robust <a href="https://developer.android.com/identity/sign-in/credential-manager-troubleshooting-guide" target="_blank">error handling mechanisms</a> to manage both user-facing and developer-facing errors. A common error, <span style="color: #0d904f; font-family: courier;">CreateCredentialCancellationException</span>, appeared when users manually canceled their passkey setup. Zoho tracked the frequency of this error to assess potential UX improvements. Based on Android's <a href="https://developer.android.com/design/ui/mobile/guides/patterns/passkeys" target="_blank">UX recommendations</a>, Zoho took steps to better educate their users about passkeys, ensure users were aware of passkey availability, and promote passkey adoption during subsequent sign-in attempts.</p>
<p>This code example demonstrates Zoho's approach for how they handled their most common passkey creation errors:</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: green; font-weight: bold;">private</span> <span style="color: green; font-weight: bold;">fun</span> <span style="color: blue;">handleFailure</span>(e: CreateCredentialException) {
<span style="color: green; font-weight: bold;">val</span> msg = <span style="color: green; font-weight: bold;">when</span> (e) {
<span style="color: green; font-weight: bold;">is</span> CreateCredentialCancellationException -> {
Analytics.addAnalyticsEvent(<span style="color: #ba2121;">"PASSKEY_SETUP_CANCELLED"</span>, GROUP_NAME)
Analytics.addNonFatalException(e)
<span style="color: #ba2121;">"The operation was canceled by the user."</span>
}
<span style="color: green; font-weight: bold;">is</span> CreateCredentialInterruptedException -> {
Analytics.addAnalyticsEvent(<span style="color: #ba2121;">"PASSKEY_SETUP_INTERRUPTED"</span>, GROUP_NAME)
Analytics.addNonFatalException(e)
<span style="color: #ba2121;">"Passkey setup was interrupted. Please try again."</span>
}
<span style="color: green; font-weight: bold;">is</span> CreateCredentialProviderConfigurationException -> {
Analytics.addAnalyticsEvent(<span style="color: #ba2121;">"PASSKEY_PROVIDER_MISCONFIGURED"</span>, GROUP_NAME)
Analytics.addNonFatalException(e)
<span style="color: #ba2121;">"Credential provider misconfigured. Contact support."</span>
}
<span style="color: green; font-weight: bold;">is</span> CreateCredentialUnknownException -> {
Analytics.addAnalyticsEvent(<span style="color: #ba2121;">"PASSKEY_SETUP_UNKNOWN_ERROR"</span>, GROUP_NAME)
Analytics.addNonFatalException(e)
<span style="color: #ba2121;">"An unknown error occurred during Passkey setup."</span>
}
<span style="color: green; font-weight: bold;">is</span> CreatePublicKeyCredentialDomException -> {
Analytics.addAnalyticsEvent(<span style="color: #ba2121;">"PASSKEY_WEB_AUTHN_ERROR"</span>, GROUP_NAME)
Analytics.addNonFatalException(e)
<span style="color: #ba2121;">"Passkey creation failed: ${e.domError}"</span>
}
<span style="color: green; font-weight: bold;">else</span> -> {
Analytics.addAnalyticsEvent(<span style="color: #ba2121;">"PASSKEY_SETUP_FAILED"</span>, GROUP_NAME)
Analytics.addNonFatalException(e)
<span style="color: #ba2121;">"An unexpected error occurred. Please try again."</span>
}
}
}
</pre></div>
<h3><span style="font-size: large;">Testing passkeys in intranet environments</span></h3>
<p>Zoho faced an initial challenge in testing passkeys within a closed intranet environment. The Google Password Manager <a href="https://developer.android.com/identity/sign-in/credential-manager#add-support-dal" target="_blank">verification process</a> for passkeys requires public domain access to validate the relying party (RP) domain. However, Zoho's internal testing environment lacked this public Internet access, causing the verification process to fail and hindering successful passkey authentication testing. To overcome this, Zoho created a publicly accessible test environment, which included hosting a temporary server with an <a href="https://developer.android.com/identity/sign-in/credential-manager#add-support-dal" target="_blank">asset link file</a> and domain validation.</p>
<p>This example from the <span style="color: #0d904f; font-family: courier;">assetlinks.json</span> file used in Zoho's public test environment demonstrates how to associate the relying party domain with the specified Android app for passkey validation.</p>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;">[
  {
    <span style="color: #ba2121;">"relation"</span>: [
      <span style="color: #ba2121;">"delegate_permission/common.handle_all_urls"</span>,
      <span style="color: #ba2121;">"delegate_permission/common.get_login_creds"</span>
    ],
    <span style="color: #ba2121;">"target"</span>: {
      <span style="color: #ba2121;">"namespace"</span>: <span style="color: #ba2121;">"android_app"</span>,
      <span style="color: #ba2121;">"package_name"</span>: <span style="color: #ba2121;">"com.zoho.accounts.oneauth"</span>,
      <span style="color: #ba2121;">"sha256_cert_fingerprints"</span>: [
        <span style="color: #ba2121;">"SHA_HEX_VALUE"</span>
      ]
    }
  }
]
</pre></div>
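<p>The <span style="color: #0d904f; font-family: courier;">sha256_cert_fingerprints</span> value is the SHA-256 fingerprint of the app's signing certificate, written as colon-separated uppercase hex pairs. As a rough sketch (the helper name is hypothetical, not part of Zoho's code), a digest can be formatted into that shape like this:</p>

```kotlin
import java.security.MessageDigest

// Hypothetical helper: compute a certificate's SHA-256 digest and format it the way
// assetlinks.json expects it (uppercase hex pairs separated by colons).
fun certFingerprint(certBytes: ByteArray): String {
    val digest = MessageDigest.getInstance("SHA-256").digest(certBytes)
    return digest.joinToString(":") { "%02X".format(it) }
}

fun main() {
    // Placeholder bytes stand in for a real DER-encoded signing certificate.
    println(certFingerprint(byteArrayOf(1, 2, 3)))
}
```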
<h3><span style="font-size: large;">Integrate with an existing FIDO server</span></h3>
<p>Android's passkey system utilizes the modern FIDO2 WebAuthn standard. This standard requires requests in a specific JSON format, which helps maintain consistency between native applications and web platforms. To enable Android passkey support, Zoho made minor compatibility and structural changes so its servers correctly generate and process requests that adhere to the required FIDO2 JSON structure.</p>
<p>This server update involved several specific technical adjustments:</p>
<ul><ul>
<p>1. <b>Encoding conversion:</b> The server converts the Base64 URL encoding (commonly used in WebAuthn for fields like credential IDs) to standard Base64 encoding before it stores the relevant data. The snippet below shows how a <span style="color: #0d904f; font-family: courier;">rawId</span> might be encoded to standard Base64:</p>
</ul></ul>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: #408080; font-style: italic;">// Decode the Base64 URL encoded rawId, then re-encode it as standard Base64 for storage</span>
<span style="color: green; font-weight: bold;">val</span> base64RawId: String = Base64.getEncoder()
    .encodeToString(Base64.getUrlDecoder().decode(rawId))
</pre></div>
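<p>The difference between the two alphabets is easy to see in a standalone round trip using plain <span style="color: #0d904f; font-family: courier;">java.util.Base64</span> (an illustrative sketch, not Zoho's code):</p>

```kotlin
import java.util.Base64

fun main() {
    // Bytes picked so the two alphabets visibly diverge: every 6-bit group is 62,
    // which encodes as '+' in standard Base64 and as '-' in the URL-safe variant.
    val raw = byteArrayOf(0xFB.toByte(), 0xEF.toByte(), 0xBE.toByte())

    val standard = Base64.getEncoder().encodeToString(raw)   // "++++"
    val urlSafe = Base64.getUrlEncoder().encodeToString(raw) // "----"

    // Converting WebAuthn's URL-safe form to standard Base64: decode, then re-encode.
    val converted = Base64.getEncoder().encodeToString(Base64.getUrlDecoder().decode(urlSafe))
    println(standard == converted) // true
}
```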
<ul><ul>
<p>2. <b>Transport list format:</b> To ensure consistent data processing, the server logic handles lists of transport mechanisms (such as USB, NFC, and Bluetooth, which specify how the authenticator communicated) as JSON arrays.</p>
<p>3. <b>Client data alignment:</b> The Zoho team adjusted how the server encodes and decodes the <span style="color: #0d904f; font-family: courier;">clientDataJson</span> field. This ensures the data structure aligns precisely with the expectations of Zoho’s existing internal APIs. The example below illustrates part of the conversion logic applied to client data before the server processes it:</p>
</ul></ul>
<!--Kotlin--><div style="background: rgb(248, 248, 248); border: 0px; overflow: auto; width: auto;"><pre style="line-height: 125%; margin: 0px;"><span style="color: green; font-weight: bold;">private</span> <span style="color: green; font-weight: bold;">fun</span> <span style="color: blue;">convertForServer</span>(type: String): String {
<span style="color: green; font-weight: bold;">val</span> clientDataBytes = BaseEncoding.base64().decode(type)
<span style="color: green; font-weight: bold;">val</span> clientDataJson = JSONObject(String(clientDataBytes, StandardCharsets.UTF_8))
<span style="color: green; font-weight: bold;">val</span> clientJson = JSONObject()
<span style="color: green; font-weight: bold;">val</span> challengeFromJson = clientDataJson.getString(<span style="color: #ba2121;">"challenge"</span>)
<span style="color: #408080; font-style: italic;">// Re-encode the challenge as Base64 URL so it matches the format the server expects</span>
clientJson.put(<span style="color: #ba2121;">"challenge"</span>, BaseEncoding.base64Url()
.encode(challengeFromJson.toByteArray(StandardCharsets.UTF_8)))
clientJson.put(<span style="color: #ba2121;">"origin"</span>, clientDataJson.getString(<span style="color: #ba2121;">"origin"</span>))
clientJson.put(<span style="color: #ba2121;">"type"</span>, clientDataJson.getString(<span style="color: #ba2121;">"type"</span>))
clientJson.put(<span style="color: #ba2121;">"androidPackageName"</span>, clientDataJson.getString(<span style="color: #ba2121;">"androidPackageName"</span>))
<span style="color: green; font-weight: bold;">return</span> BaseEncoding.base64().encode(clientJson.toString().toByteArray())
}
</pre></div>
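<p>For the transport-list adjustment in point 2, the normalization can be as small as joining the reported transport tokens into a JSON array string. This is a hypothetical helper shown only for illustration (the fixed transport tokens need no JSON escaping):</p>

```kotlin
// Hypothetical helper: render authenticator transports ("usb", "nfc", "ble", ...)
// as the JSON array string the server-side logic expects.
fun transportsAsJsonArray(transports: List<String>): String =
    transports.joinToString(separator = ",", prefix = "[", postfix = "]") { "\"$it\"" }

fun main() {
    println(transportsAsJsonArray(listOf("usb", "nfc", "ble"))) // ["usb","nfc","ble"]
}
```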
<h3><span style="font-size: large;">User guidance and authentication preferences</span></h3>
<p>A central part of Zoho's passkey strategy involved encouraging user adoption while providing flexibility to align with different organizational requirements. This was achieved through careful UI design and policy controls.</p>
<p>Zoho recognized that organizations have varying security needs. To accommodate this, Zoho implemented:</p>
<ul><ul>
<li><b>Admin enforcement:</b> Through the <a href="https://www.zoho.com/directory/" target="_blank">Zoho Directory</a> admin panel, administrators can designate passkeys as the mandatory, default authentication method for their entire organization. When this policy is enabled, employees are required to set up a passkey upon their next login and use it going forward.</li></ul><ul>
<li><b>User choice:</b> If an organization does not enforce a specific policy, individual users maintain control. They can choose their preferred authentication method during login, selecting from passkeys or other configured options via their authentication settings.</li>
</ul></ul>
<p>To make adopting passkeys appealing and straightforward for end-users, Zoho implemented:</p>
<ul><ul>
<li><b>Easy setup:</b> Zoho integrated passkey setup directly into the Zoho OneAuth mobile app (available for both <a href="https://play.google.com/store/apps/details?id=com.zoho.accounts.oneauth&hl=en_IN" target="_blank">Android</a> and <a href="https://apps.apple.com/in/app/authenticator-app-oneauth/id1142928979" target="_blank">iOS</a>). Users can conveniently configure their passkeys within the app at any time, smoothing the transition.</li></ul><ul>
<li><b>Consistent access:</b> Passkey support was implemented across key user touchpoints, ensuring users can register and authenticate using passkeys via:</li></ul><ul>
<ul><ul>
<li>The Zoho OneAuth mobile app (Android & iOS);</li></ul><ul>
<li>Their Zoho web <a href="https://accounts.zoho.com/home#multiTFA/pfamodes" target="_blank">accounts</a> page.</li>
</ul></ul></ul></ul>
<p>This method ensured that the process of setting up and using passkeys was accessible and integrated into the platforms they already use, regardless of whether it was mandated by an admin or chosen by the user. You can learn more about how to create smooth user flows for passkey authentication by exploring our comprehensive <a href="https://developer.android.com/design/ui/mobile/guides/patterns/passkeys" target="_blank">passkeys user experience guide</a>.</p>
<h2><span style="font-size: x-large;">Impact on developer velocity and integration efficiency</span></h2>
<p>Credential Manager, as a unified API, also helped improve developer productivity compared to older sign-in flows. It reduced the complexity of handling multiple authentication methods and APIs separately, cutting integration time from months to weeks and reducing implementation errors. This collectively streamlined the sign-in process and improved overall reliability.</p>
<p>By implementing passkeys with Credential Manager, Zoho achieved significant, measurable improvements across the board:</p>
<ul><ul>
<li><b>Dramatic speed improvements</b></li>
<ul><ul>
<li><b>2x faster</b> login compared to traditional password authentication.</li></ul><ul>
<li><b>4x faster</b> login compared to username or mobile number with email or SMS OTP authentication.</li></ul><ul>
<li><b>6x faster</b> login compared to username, password, and SMS or authenticator OTP authentication.</li>
</ul></ul>
<li><b>Reduced support costs</b></li>
<ul><ul>
<li><b>Reduced password-related support requests</b>, especially for forgotten passwords.</li></ul><ul>
<li><b>Lower costs</b> associated with SMS-based 2FA, as existing users can onboard directly with passkeys.</li>
</ul></ul>
<li><b>Strong user adoption & enhanced security:</b></li>
<ul><ul>
<li><b>Passkey sign-ins doubled</b> in just 4 months, showing high user acceptance.</li></ul><ul>
<li>Users migrating to passkeys are <b>fully protected</b> from common phishing and password breach threats.</li></ul><ul>
<li>With <b>31% MoM adoption growth</b>, more users are benefiting daily from enhanced security against vulnerabilities like phishing and SIM swaps.</li>
</ul></ul></ul></ul>
<h2><span style="font-size: x-large;">Recommendations and best practices</span></h2>
<p>To successfully implement passkeys on Android, developers should consider the following best practices:</p>
<ul><ul>
<li><b>Leverage Android's Credential Manager API:</b></li>
<ul><ul>
<li>Credential Manager simplifies credential retrieval, reducing developer effort and ensuring a unified authentication experience.</li></ul><ul>
<li>Handles passwords, passkeys, and federated login flows in a single interface.</li>
</ul></ul>
<li><b>Ensure data encoding consistency while migrating from other FIDO authentication solutions:</b></li>
<ul><ul>
<li>Ensure consistent formatting for all inputs and outputs, such as credential IDs and challenges, when migrating from solutions like FIDO security keys.</li>
</ul></ul>
<li><b>Optimize error handling and logging:</b>
<ul><ul>
<li>Implement robust error handling for a seamless user experience.</li></ul><ul>
<li>Provide localized error messages and use detailed logs to debug and resolve unexpected failures.</li>
</ul></ul>
</li><li><b>Educate users on passkey recovery options:</b></li>
<ul><ul>
<li>Prevent lockout scenarios by proactively guiding users on recovery options.</li>
</ul></ul>
<li><b>Monitor adoption metrics and user feedback:</b>
<ul><ul>
<li>Track user engagement, passkey adoption rates, and login success rates to keep optimizing user experience.</li></ul><ul>
<li>Conduct A/B testing on different authentication flows to improve conversion and retention.</li>
</ul></ul></li></ul></ul>
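<p>The error-handling recommendation above can be sketched as a small mapping from failure categories to user-facing, localization-ready messages, with the raw detail left to logging. The category names here are illustrative only, not Credential Manager's actual exception types:</p>

```kotlin
// Sketch: translate low-level failure categories into friendly messages for the UI,
// falling back to a generic message for anything unrecognized. In a real app these
// strings would come from localized resources, and the raw error would be logged.
fun userFacingMessage(category: String): String = when (category) {
    "USER_CANCELED" -> "Sign-in was canceled."
    "NO_CREDENTIAL" -> "No passkey found for this account. Try another sign-in method."
    "INTERRUPTED" -> "Sign-in was interrupted. Please try again."
    else -> "An unexpected error occurred. Please try again."
}

fun main() {
    println(userFacingMessage("NO_CREDENTIAL"))
}
```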
<p>Passkeys, combined with the <a href="https://developer.android.com/training/sign-in/passkeys" target="_blank">Android Credential Manager API</a>, offer a powerful, unified authentication solution that enhances security while simplifying user experience. Passkeys significantly reduce phishing risks, credential theft, and unauthorized access. We encourage developers to try out the experience in their app and bring the most secure authentication to their users.</p>
<h2><span style="font-size: x-large;">Get started with passkeys and Credential Manager</span></h2>
<p>Get hands on with passkeys and Credential Manager on Android using our <a href="https://github.com/android/identity-samples/tree/credman-compose" target="_blank">public sample code</a>.</p>
<p>If you have any questions or issues, you can share with us through the <a href="https://issuetracker.google.com/issues?q=1301097" target="_blank">Android Credentials issues tracker</a>.</p>Android Developershttp://www.blogger.com/profile/08588467489110681140[email protected]0