Google I/O 2018 Focused On Android AI Development
Google's new toolkits for Android AI development were a key takeaway for mobile developers from the I/O 2018 keynote.
Last year at Google I/O 2017, Google announced that the company was shifting from "mobile first" to "AI first". At Google I/O 2018, the company took significant steps toward executing on that strategy, announcing toolkits and programs that further streamline the Android AI development process. There were also some great app design updates to Material Design. New app development toolkits like Android Jetpack were shown and made available to developers immediately following the keynote. These advancements push the Android platform forward and streamline our app development processes.
Android Jetpack Is A Unified Toolkit for Android Developers
"There are 6 ways to do anything on Android" is a running joke in the Android community. Google took note and announced Android Jetpack, a unified toolkit that lets Android developers implement everything from tests to navigation to a local database on a standardized infrastructure. One of our favorite tools in Jetpack is Slices, which adds UI templates to Search and Google Assistant. Now we can create voice-activated functions for any app, opening up AI possibilities for every single project going forward.
ML Kit Is Bringing Machine Learning AI To Mobile
Google announced ML Kit, which will bring machine learning to mobile. The base APIs let Android and iOS developers easily add image labeling, text recognition, face detection, barcode scanning, and landmark detection to their apps, with more features forthcoming. The most intriguing aspect of this news is that ML Kit is distributed as a Firebase SDK, integrating it with Google's top mobile app development hub.
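To give a feel for what this looks like in practice, here is a minimal Kotlin sketch of on-device text recognition using the launch-era Firebase ML Kit API. Class and method names reflect the initial `firebase-ml-vision` release shown at I/O 2018 and may change in later SDK versions; the dependency line in the comment is illustrative, not a pinned version.

```kotlin
// In the app module's build.gradle, ML Kit is pulled in as a Firebase dependency:
//   implementation 'com.google.firebase:firebase-ml-vision:<version>'

import android.graphics.Bitmap
import com.google.firebase.ml.vision.FirebaseVision
import com.google.firebase.ml.vision.common.FirebaseVisionImage

// Runs ML Kit's on-device text recognition over a captured bitmap.
// Detection is asynchronous: results arrive in the success listener
// as a Task callback rather than as a return value.
fun recognizeText(bitmap: Bitmap) {
    val image = FirebaseVisionImage.fromBitmap(bitmap)
    val detector = FirebaseVision.getInstance().visionTextDetector

    detector.detectInImage(image)
        .addOnSuccessListener { result ->
            // Each block is a paragraph-level chunk of detected text.
            for (block in result.blocks) {
                println(block.text)
            }
        }
        .addOnFailureListener { e ->
            // On-device detection can fail, e.g. before a model has downloaded.
            println("Text recognition failed: $e")
        }
}
```

The same pattern (build an image, grab a detector from `FirebaseVision`, listen on the returned task) applies to the other base APIs like face detection and barcode scanning.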
Dialogflow Updates to Google Assistant AI
Dialogflow serves as the underlying tech for many of the Google Assistant AI improvements announced at I/O 2018. Continued Conversation lets users carry on a dialogue with their Assistant without prefacing every sentence with "Hey, Google." The ability to create custom Routines and make multiple requests in one voice command are quality-of-life improvements that open the door for new AI development on our end.
Now, Google Duplex may strike a lot of people as unsettling, but these more iterative updates, in conjunction with the tools Google is shipping to simplify Android AI development, could open up real possibilities for Google Assistant integration with our apps.
Android Things Lets Us Create For Smart Devices
Android Things has been in the conversation for over a year but has only just officially launched. Android Things is a version of the Android OS tailored to smart device development: it lets you write firmware for smart devices using Android tools. This means our team can build AI into smart devices using toolsets they already know deeply, like Firebase and ML Kit.
Material Theming Makes It Easier To Turn Our Wireframes Into Apps
In the realm of design, Google announced the addition of Material Theming to Material Design, Google's visual design language. The Theme Editor allows teams to unify design across platforms. This makes it easier and faster for us to take a wireframe built with Material components, turn it into a customized and beautiful design, and import it into a web, iOS, or Android app. It is only available in Sketch right now, but since our designers are Sketch aficionados, it helps them and our developers work together to create beautiful and functional products.
How Google's Android AI Development Announcements Impact Fueled
Google I/O 2018 showed off a lot for all the teams here at Fueled. These updates all help streamline the Android design and development process for our team. As a forward-facing company, finding more accessible ways to integrate AI tech into our apps gets us pumped up and ready to keep putting our expert design and Android development skills to the test.