This is a sample Unity project showing how to track button events and Mixed Reality event telemetry using Application Insights.
- Create a new Application Insights service in Azure.
- Copy and paste your Application Insights Instrumentation Key into the Application Insights script in the Unity scene.
- (Optional) You may also wish to add your App Id and (read-only) Key for running Application Insights query scripts (see the query sketch after this list).
- Add one instance of the UnityApplicationInsights script to your main Unity scene.
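For the optional App Id and read-only key, here is a minimal sketch of how they could be used from Unity to run a query against the Application Insights query REST API. The endpoint and `x-api-key` header are part of the public Application Insights API; the component, field names and query string are illustrative assumptions rather than code from this project.

```csharp
// Minimal sketch: query Application Insights from Unity using the App Id and a read-only API key.
// The appId, apiKey and query values are placeholders (assumptions), not part of this project.
using System.Collections;
using UnityEngine;
using UnityEngine.Networking;

public class AppInsightsQueryExample : MonoBehaviour
{
    [SerializeField] private string appId = "YOUR-APP-ID";        // Application Insights App Id
    [SerializeField] private string apiKey = "YOUR-READONLY-KEY"; // Read-only API key

    private IEnumerator Start()
    {
        // Count custom events by name over the last day (Analytics/Kusto query).
        string query = UnityWebRequest.EscapeURL("customEvents | where timestamp > ago(1d) | summarize count() by name");
        string url = $"https://api.applicationinsights.io/v1/apps/{appId}/query?query={query}";

        using (UnityWebRequest request = UnityWebRequest.Get(url))
        {
            request.SetRequestHeader("x-api-key", apiKey);
            yield return request.SendWebRequest();
            Debug.Log(request.downloadHandler.text); // JSON table of event counts
        }
    }
}
```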
Unity scene changes are automatically captured as a PageView event.
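Conceptually, this amounts to subscribing to Unity's scene-loaded callback and sending a PageView item for each scene. The sketch below shows the mechanism; `TrackPageView` is a hypothetical stand-in for the project's actual telemetry call.

```csharp
// Sketch of scene-change tracking: each loaded scene is reported as a PageView.
// TrackPageView is a hypothetical helper standing in for the project's telemetry call.
using UnityEngine;
using UnityEngine.SceneManagement;

public class SceneTelemetryExample : MonoBehaviour
{
    private void OnEnable()  { SceneManager.sceneLoaded += OnSceneLoaded; }
    private void OnDisable() { SceneManager.sceneLoaded -= OnSceneLoaded; }

    private void OnSceneLoaded(Scene scene, LoadSceneMode mode)
    {
        // Log the scene name as a PageView so scene flow shows up in Usage.
        TrackPageView(scene.name);
    }

    private void TrackPageView(string pageName)
    {
        Debug.Log($"PageView: {pageName}"); // placeholder for the real Application Insights call
    }
}
```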
To add logging to Application Insights for Unity UI buttons, you can either manually attach the ButtonTrackerBehaviour to the Unity button, or enable the Add Tracker Behaviours setting in the Inspector, which attaches the script automatically to all selectable game objects on scene change.
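A tracker behaviour of this kind boils down to hooking the button's onClick event and logging a custom event; the sketch below illustrates the idea. `TrackEvent` is a hypothetical placeholder for the project's logging call, and `AttachToAllSelectables` only approximates what the Add Tracker Behaviours setting does.

```csharp
// Sketch of a button tracker: logs a custom event whenever the attached Button is clicked.
// TrackEvent is a hypothetical placeholder for the project's Application Insights call.
using UnityEngine;
using UnityEngine.UI;

public class ButtonTrackerExample : MonoBehaviour
{
    private void Awake()
    {
        Button button = GetComponent<Button>();
        if (button != null)
        {
            button.onClick.AddListener(() => TrackEvent("ButtonClick", gameObject.name));
        }
    }

    private static void TrackEvent(string eventName, string target)
    {
        Debug.Log($"{eventName}: {target}"); // placeholder for the real telemetry call
    }

    // Rough equivalent of the automatic option: add the tracker to every Selectable in the scene.
    public static void AttachToAllSelectables()
    {
        foreach (Selectable selectable in Object.FindObjectsOfType<Selectable>())
        {
            if (selectable.GetComponent<ButtonTrackerExample>() == null)
            {
                selectable.gameObject.AddComponent<ButtonTrackerExample>();
            }
        }
    }
}
```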
| Scene name | Description |
|---|---|
| Scene-UI | Unity UI Button event telemetry sample scene |
| Scene-MR | Interactive hologram event sample scene for Mixed Reality |
In the Application Insights Usage section, you can visualise the telemetry logged from your app.
- Chart user flows across Unity scene changes, split by custom or interaction events.
- View users and events during sessions.
- Create funnels by defining step-by-step conditions to measure conversion rates.
- Review returning users over a period of time.
To create custom visualizations using all the data collected by Application Insights, you can use the Ibex Dashboard. Fork the project and copy the Unity UI template or MR template file into the server/dashboards/preconfigured directory. You should then be able to create your own dashboard by using these templates in the app.
Json.Net is currently required to serialize Dictionary objects in Unity.
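For example, custom event properties kept in a Dictionary can be serialized with Json.Net before being included in the telemetry payload; a minimal sketch with illustrative property values:

```csharp
// Minimal sketch: serialize custom event properties with Json.Net (Newtonsoft.Json).
using System.Collections.Generic;
using Newtonsoft.Json;
using UnityEngine;

public static class TelemetrySerializationExample
{
    public static string SerializeProperties()
    {
        var properties = new Dictionary<string, string>
        {
            { "scene", "Scene-UI" },       // illustrative values only
            { "button", "StartButton" }
        };
        string json = JsonConvert.SerializeObject(properties);
        Debug.Log(json); // {"scene":"Scene-UI","button":"StartButton"}
        return json;
    }
}
```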
Optional dependencies are included as git submodules which can be installed after cloning:
```
git submodule update --init --recursive
```
The Voronoi selection sample code is commented out inside the Scripts/MR/Tap.cs script.
Voronoi selection is useful in Mixed Reality scenarios for selecting holograms that may be small, close together, or even overlapping. When capturing telemetry in MR scenarios we might want to capture Air Taps on holograms, but what happens when a user taps on empty space and misses the target? We can log useful metrics about the nearest objects to Application Insights, or even trigger the closest object.
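As a simplified illustration of the idea (not the actual Voronoi code in Tap.cs), a missed tap could fall back to the hologram nearest to the gaze ray and log it as telemetry; the nearest-object search below is an assumption for demonstration only.

```csharp
// Simplified sketch: when a tap misses, find the hologram closest to the gaze ray
// and log it, so near-miss telemetry (or a fallback selection) is possible.
// This is a plain nearest-object search, not the Voronoi code in Scripts/MR/Tap.cs.
using UnityEngine;

public static class NearestHologramExample
{
    public static Transform FindNearest(Ray gazeRay, Transform[] holograms)
    {
        Transform nearest = null;
        float bestDistance = float.MaxValue;

        foreach (Transform hologram in holograms)
        {
            // Perpendicular distance from the hologram to the gaze ray.
            Vector3 toHologram = hologram.position - gazeRay.origin;
            float distance = Vector3.Cross(gazeRay.direction.normalized, toHologram).magnitude;

            if (distance < bestDistance)
            {
                bestDistance = distance;
                nearest = hologram;
            }
        }

        if (nearest != null)
        {
            Debug.Log($"Nearest hologram to missed tap: {nearest.name} ({bestDistance:F2}m)"); // placeholder for telemetry
        }
        return nearest;
    }
}
```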
I decided to remove the dependency on the Application Insights plugin as it only supported Windows devices.
- Analytics for Mixed Reality
- MR and Azure 309: Application insights
- Using Voronoi selection in Mixed Reality
- Sending metrics to Application Insights
- Visual Studio AppCenter has released the AppCenter SDK for Unity with support for iOS, Android and Windows devices, and there is documentation for getting started in Unity.