In this blog post, you will learn how to recognize emotions in images using Cognitive Services in Xamarin.Forms.
Introduction
Microsoft Cognitive Services are cloud-based APIs that let you add vision, speech, and language intelligence to your apps without building machine learning models yourself. In this post, we will use the emotion capability of the Face API to detect the emotion on faces in photos taken from a Xamarin.Forms app.
Cognitive Services
Infuse your apps, websites, and bots with intelligent algorithms to see, hear, speak, understand, and interpret your users' needs through natural methods of communication.
Use AI to solve business problems
- Vision
- Speech
- Knowledge
- Search
- Language
- The Emotion API takes a facial expression in an image as input and returns the confidence across a set of emotions for each face in the image, as well as a bounding box for the face, using the Face API. If you have already called the Face API, you can submit the face rectangle as an optional input.
- The emotions detected are anger, contempt, disgust, fear, happiness, neutral, sadness, and surprise. These emotions are understood to be communicated cross-culturally and universally through particular facial expressions.
Prerequisites
- Visual Studio 2017 (Windows or Mac)
- Emotion API Key
Start by creating a new Xamarin.Forms project. You'll learn more by going through the steps yourself.
Choose the Xamarin.Forms App Project type under Cross-platform/App in the New Project dialog.
Name your app, select “Use .NET Standard” for shared code, and target both Android and iOS.
You probably want your project and solution to use the same name as your app. Put it in your preferred folder for projects and click Create.
You now have a basic Xamarin.Forms app. Click the play button to try it out.
Get Emotion API Key
In this step, get Emotion API Key. Go to the following link.
https://azure.microsoft.com/en-in/services/cognitive-services/
Click "Try Cognitive Services for free".
Now, you can choose Face under Vision APIs. Afterward, click "Get API Key".
Read the terms, and select your country/region. Afterward, click "Next".
Now, log in using your preferred account.
Now, the API Key is activated. You can use it now.
The trial key is valid for only 7 days. If you want a permanent key, refer to the following article.
Getting Started With Microsoft Azure Cognitive Services Face APIs
Setting up the User Interface
Go to MainPage.xaml and write the following code.
MainPage.xaml
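The exact layout is up to you; a minimal sketch of MainPage.xaml might look like the following. The `x:Class` namespace, the control names (`image`, `EmotionLabel`), and the event handler name are assumptions here — match them to your own project:

```xml
<?xml version="1.0" encoding="utf-8" ?>
<ContentPage xmlns="http://xamarin.com/schemas/2014/forms"
             xmlns:x="http://schemas.microsoft.com/winfx/2009/xaml"
             x:Class="EmotionRecognition.MainPage">
    <StackLayout Padding="20" VerticalOptions="Center">
        <!-- Shows the photo taken by the user -->
        <Image x:Name="image" HeightRequest="240" />
        <!-- Starts the camera via the Media plugin -->
        <Button Text="Take Photo" Clicked="TakePhotoButton_Clicked" />
        <!-- Displays the detected emotion returned by the Face API -->
        <Label x:Name="EmotionLabel" FontSize="Large" HorizontalTextAlignment="Center" />
    </StackLayout>
</ContentPage>
```

The button triggers the camera, the image shows the captured photo, and the label will display the detected emotion.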
Click the play button to try it out.
NuGet Packages
Now, add the following NuGet Packages.
- Xam.Plugin.Media
- Newtonsoft.Json
In this step, add Xam.Plugin.Media to your project. You can install Xam.Plugin.Media via NuGet, or you can browse the source code on GitHub.
Go to Solution Explorer and select your solution. Right-click and select "Manage NuGet Packages for Solution". Search for "Xam.Plugin.Media" and add the package. Remember to install it for each project (the shared project, Android, iOS, and UWP).
Permissions
In this step, give the following required permissions to your app.
Permissions for Android (AndroidManifest.xml)
- CAMERA
- READ_EXTERNAL_STORAGE
- WRITE_EXTERNAL_STORAGE
Permissions for iOS (Info.plist)
- NSCameraUsageDescription
- NSPhotoLibraryUsageDescription
- NSMicrophoneUsageDescription
- NSPhotoLibraryAddUsageDescription
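As a sketch, the Android entries go inside the `<manifest>` element of AndroidManifest.xml, and the iOS keys go in Info.plist with a usage message of your choosing (the description strings below are placeholders):

```xml
<!-- AndroidManifest.xml: inside the <manifest> element -->
<uses-permission android:name="android.permission.CAMERA" />
<uses-permission android:name="android.permission.READ_EXTERNAL_STORAGE" />
<uses-permission android:name="android.permission.WRITE_EXTERNAL_STORAGE" />

<!-- Info.plist: inside the top-level <dict> element -->
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to take photos for emotion analysis.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>This app accesses the photo library to pick images for emotion analysis.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app may access the microphone when recording media.</string>
<key>NSPhotoLibraryAddUsageDescription</key>
<string>This app saves captured photos to the photo library.</string>
```

Without the iOS usage descriptions, the app will crash at runtime when the camera or photo library is first accessed.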
In this step, create a model to deserialize the API response.
ResponseModel.cs
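A minimal sketch of the model, assuming the Face API is called with `returnFaceAttributes=emotion` and the response is deserialized with Newtonsoft.Json. The class and property names here are illustrative; only the JSON property names must match the API:

```csharp
using System.Collections.Generic;
using Newtonsoft.Json;

namespace EmotionRecognition
{
    // One element of the JSON array returned by the Face API detect endpoint, e.g.:
    // [{"faceId":"...","faceRectangle":{"top":40,"left":30,"width":100,"height":100},
    //   "faceAttributes":{"emotion":{"happiness":0.98,"neutral":0.02, ...}}}]
    public class ResponseModel
    {
        [JsonProperty("faceId")]
        public string FaceId { get; set; }

        [JsonProperty("faceRectangle")]
        public FaceRectangle FaceRectangle { get; set; }

        [JsonProperty("faceAttributes")]
        public FaceAttributes FaceAttributes { get; set; }
    }

    // Bounding box of the detected face, in pixels
    public class FaceRectangle
    {
        public int Top { get; set; }
        public int Left { get; set; }
        public int Width { get; set; }
        public int Height { get; set; }
    }

    public class FaceAttributes
    {
        [JsonProperty("emotion")]
        public Emotion Emotion { get; set; }
    }

    // Confidence scores (0..1) for each detected emotion
    public class Emotion
    {
        public double Anger { get; set; }
        public double Contempt { get; set; }
        public double Disgust { get; set; }
        public double Fear { get; set; }
        public double Happiness { get; set; }
        public double Neutral { get; set; }
        public double Sadness { get; set; }
        public double Surprise { get; set; }
    }
}
```

Newtonsoft.Json matches property names case-insensitively, so the lowercase JSON keys bind to the Pascal-case properties without extra attributes.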
Emotion Recognition
In this step, write the following code for Emotion Recognition.
MainPage.xaml.cs
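As a sketch of what the code-behind might look like — assuming the ResponseModel class from the previous step, the control and handler names used in MainPage.xaml, a placeholder subscription key, and a `westus` endpoint (use the region shown with your own key):

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;
using Newtonsoft.Json;
using Plugin.Media;
using Plugin.Media.Abstractions;
using Xamarin.Forms;

namespace EmotionRecognition
{
    public partial class MainPage : ContentPage
    {
        // Placeholder values: substitute your own key, and the endpoint
        // region shown on your Azure portal / trial-key page.
        const string SubscriptionKey = "YOUR_API_KEY";
        const string DetectUri =
            "https://westus.api.cognitive.microsoft.com/face/v1.0/detect?returnFaceAttributes=emotion";

        public MainPage()
        {
            InitializeComponent();
        }

        async void TakePhotoButton_Clicked(object sender, EventArgs e)
        {
            await CrossMedia.Current.Initialize();
            if (!CrossMedia.Current.IsCameraAvailable || !CrossMedia.Current.IsTakePhotoSupported)
            {
                await DisplayAlert("No Camera", "Camera is not available.", "OK");
                return;
            }

            // Capture a photo with the Media plugin
            var photo = await CrossMedia.Current.TakePhotoAsync(
                new StoreCameraMediaOptions { PhotoSize = PhotoSize.Medium });
            if (photo == null)
                return;

            image.Source = ImageSource.FromStream(() => photo.GetStream());
            EmotionLabel.Text = await DetectEmotionAsync(photo.GetStream());
        }

        // Posts the photo to the Face API and returns the strongest emotion
        async Task<string> DetectEmotionAsync(Stream imageStream)
        {
            using (var client = new HttpClient())
            using (var content = new StreamContent(imageStream))
            {
                client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", SubscriptionKey);
                content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");

                var response = await client.PostAsync(DetectUri, content);
                var json = await response.Content.ReadAsStringAsync();
                var faces = JsonConvert.DeserializeObject<List<ResponseModel>>(json);
                if (faces == null || faces.Count == 0)
                    return "No face detected";

                // Pick the emotion with the highest confidence score
                var emotion = faces[0].FaceAttributes.Emotion;
                var scores = new Dictionary<string, double>
                {
                    ["Anger"] = emotion.Anger, ["Contempt"] = emotion.Contempt,
                    ["Disgust"] = emotion.Disgust, ["Fear"] = emotion.Fear,
                    ["Happiness"] = emotion.Happiness, ["Neutral"] = emotion.Neutral,
                    ["Sadness"] = emotion.Sadness, ["Surprise"] = emotion.Surprise
                };
                return scores.OrderByDescending(kv => kv.Value).First().Key;
            }
        }
    }
}
```

The API returns a confidence score for every emotion, so the sketch simply reports the one with the highest value; in a real app you may also want to handle HTTP error responses and show the full score breakdown.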
Click the play button to try it out.
Happy Coding...!😊
I hope you have understood how to recognize emotions in images using Cognitive Services in Xamarin.Forms.
Thanks for reading. Please share comments and feedback.