Insider Dev Tour
Singapore: 23 June, Microsoft Singapore, 8.30 am - 5 pm
Malaysia: 26 June, Microsoft Malaysia, 9.30 am - 4.30 pm
Register @ https://insiderdevtour.com/
In my previous article, I gave a brief introduction to Azure Cognitive Services and analyzed photos with Xamarin. Today, using the Text Analytics service in Cognitive Services, we will analyze the text we receive from the user and detect its sentiment, its key phrases, and which language it is written in.
What does this give us? We can learn what mood users are in while they use the application, which topics they emphasize, and which words they use most often. With that data we can spot our shortcomings and improve the application accordingly. Let's get to practice.
First we create the Text Analytics service that we will use on Azure. Remember to note the Key after you create it.
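As a minimal sketch of where that key ends up: every Text Analytics request carries it in the `Ocp-Apim-Subscription-Key` header. The endpoint region and key below are placeholders — substitute the values from your own Azure resource.

```csharp
using System;
using System.Net.Http;

public static class TextAnalyticsClient
{
    // Placeholder values — copy the real endpoint and key from the Azure portal.
    const string Endpoint = "https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/";
    const string SubscriptionKey = "<your-key-here>";

    public static HttpClient Create()
    {
        var client = new HttpClient { BaseAddress = new Uri(Endpoint) };
        // The service authenticates every call through this header.
        client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", SubscriptionKey);
        return client;
    }
}
```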
Then we open the PCL project and add the Microsoft.Bcl.Build, Microsoft.Bcl, HttpClient, and Newtonsoft.Json packages from NuGet. If you are not familiar with these steps, I recommend reading my previous article first.
After that, for readability I wrote the Xamarin side with MVVM; because of that I also use helpers such as a ListViewItemSelected command. First of all, I created the models to send as the request body.
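A sketch of what those body models can look like, assuming the Text Analytics v2 request shape `{ "documents": [ { "language": "en", "id": "1", "text": "..." } ] }`; the class and property names are my own choice, not taken from the article:

```csharp
using System.Collections.Generic;
using Newtonsoft.Json;

// One item of the "documents" array sent to the sentiment endpoint.
public class SentimentDocument
{
    [JsonProperty("language")] public string Language { get; set; }
    [JsonProperty("id")]       public string Id { get; set; }
    [JsonProperty("text")]     public string Text { get; set; }
}

// The full request body that gets serialized to JSON.
public class SentimentRequestBody
{
    [JsonProperty("documents")] public List<SentimentDocument> Documents { get; set; }
}
```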
I created a separate body model for Language Detection.
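The difference is that for language detection the documents carry no `language` field, since the service infers it. Again, the names below are illustrative:

```csharp
using System.Collections.Generic;
using Newtonsoft.Json;

// A document for the "languages" endpoint: only id and text.
public class DetectDocument
{
    [JsonProperty("id")]   public string Id { get; set; }
    [JsonProperty("text")] public string Text { get; set; }
}

public class DetectRequestBody
{
    [JsonProperty("documents")] public List<DetectDocument> Documents { get; set; }
}
```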
I also created separate models for the data that comes back. I recommend the json2csharp site for generating these models from sample JSON.
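For example, the sentiment endpoint answers with roughly `{ "documents": [ { "score": 0.96, "id": "1" } ], "errors": [] }`, which json2csharp would turn into classes like the following sketch (names are illustrative):

```csharp
using System.Collections.Generic;
using Newtonsoft.Json;

// One scored document from the sentiment response.
// Score is between 0 and 1; closer to 1 means more positive.
public class SentimentResult
{
    [JsonProperty("score")] public double Score { get; set; }
    [JsonProperty("id")]    public string Id { get; set; }
}

public class SentimentResponse
{
    [JsonProperty("documents")] public List<SentimentResult> Documents { get; set; }
}
```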
Then we wrote the service that sends our data from Xamarin and reads back the result (if you do not know this part, don't worry about it too much). The text we collect from the user is serialized into the body model, and the service returns its answer as JSON based on the data we send.
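A minimal, self-contained sketch of such a call against the sentiment endpoint, assuming the v2 REST API; the region, key, and method name are placeholders, not the article's actual code:

```csharp
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;

public static class SentimentService
{
    // Placeholder endpoint and key — use your own resource's values.
    const string Endpoint = "https://westus.api.cognitive.microsoft.com/text/analytics/v2.0/sentiment";
    const string SubscriptionKey = "<your-key-here>";

    public static async Task<double> GetSentimentAsync(string text)
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Add("Ocp-Apim-Subscription-Key", SubscriptionKey);

            // Serialize the user's text into the expected body shape.
            var body = new { documents = new[] { new { language = "en", id = "1", text } } };
            var content = new StringContent(
                JsonConvert.SerializeObject(body), Encoding.UTF8, "application/json");

            var response = await client.PostAsync(Endpoint, content);
            var json = JObject.Parse(await response.Content.ReadAsStringAsync());

            // Score is 0–1; closer to 1 means more positive.
            return (double)json["documents"][0]["score"];
        }
    }
}
```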
Now let's see how it works.