Have you ever wondered whether your smartphone can sense where you are, what you're doing, or what's nearby? Imagine capturing an image that combines weather information with your current situation. At the recent Google I/O conference, Google announced new tools that let developers build apps that respond to the user's current situation.
Google is introducing its Awareness API to make these smart features accessible. Imagine you are out jogging and your music application automatically opens an energetic playlist. Or consider another scenario: you are driving near a pharmacy, and if it is open, an app on your phone prompts you with an alert.
Functions like detecting user location, user activity, and nearby places were already available, but combining them meant calling multiple APIs, which slowed the device, drained the battery, and hurt performance. With the introduction of Google's Awareness API, these signals are accessible through a single, unified API. Data can also interoperate between Google devices and Android smart wearables.
The Awareness API actually comprises two APIs: the Fence API, which reacts when the user enters a defined situation (such as starting to jog near a particular place), and the Snapshot API, which lets an app request information about the user's current context on demand.
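To illustrate the idea behind the Fence API, here is a minimal conceptual sketch in plain Java. All class and method names here (`ContextSnapshot`, `Fence`, `isTriggered`) are hypothetical stand-ins, not the real `com.google.android.gms.awareness` classes; the sketch only models the core notion of a fence as a combinable condition over the user's current context, like the "driving near an open pharmacy" scenario above.

```java
import java.util.function.Predicate;

// Hypothetical stand-in for the context data that, in the real Awareness API,
// would come from Snapshot API queries (current activity, nearby places, etc.).
class ContextSnapshot {
    final String activity;      // e.g. "DRIVING", "JOGGING"
    final String nearbyPlace;   // e.g. "PHARMACY"
    final boolean placeOpen;

    ContextSnapshot(String activity, String nearbyPlace, boolean placeOpen) {
        this.activity = activity;
        this.nearbyPlace = nearbyPlace;
        this.placeOpen = placeOpen;
    }
}

// A "fence" modeled as a predicate over the current context. The real Fence
// API similarly lets you combine primitive fences (activity, location, time)
// and fires a callback when the combined condition becomes true.
class Fence {
    private final Predicate<ContextSnapshot> condition;

    Fence(Predicate<ContextSnapshot> condition) {
        this.condition = condition;
    }

    // Combine two fences so both conditions must hold.
    Fence and(Fence other) {
        return new Fence(c -> this.condition.test(c) && other.condition.test(c));
    }

    boolean isTriggered(ContextSnapshot current) {
        return condition.test(current);
    }
}

public class FenceDemo {
    public static void main(String[] args) {
        Fence driving = new Fence(c -> c.activity.equals("DRIVING"));
        Fence openPharmacyNearby =
            new Fence(c -> c.nearbyPlace.equals("PHARMACY") && c.placeOpen);

        // "Driving near an open pharmacy" -> show the alert.
        Fence alertFence = driving.and(openPharmacyNearby);

        ContextSnapshot now = new ContextSnapshot("DRIVING", "PHARMACY", true);
        if (alertFence.isTriggered(now)) {
            System.out.println("ALERT");
        }
    }
}
```

In the real API, an app registers such a fence once and Google Play services evaluates it in the background, so the app does not have to poll each sensor itself.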
A few examples shared by Google:
- A weather application could sense that a Chromecast is plugged into your bedroom TV and display the current weather details on its screen.
- An alarm clock could wake you the next morning based on when you went to sleep and when your first meeting is that day.
- If you forget to start your running tracker before a jog, the application could turn on automatically.
For any app to succeed, it is important to define a smart business strategy and a professional UI/UX design, and above all, to work with an experienced Android app development company that can turn the idea into a working product!