Emotio is an iOS app designed for people diagnosed with ASD (Autism Spectrum Disorder) or for those who suffer from alexithymia, a personality trait characterized by difficulties identifying, understanding, and describing emotions, both in oneself and in others. The app detects facial emotions through the phone's camera using a mobile-optimized YOLOv11 model that classifies emotions from facial expressions with low latency and high accuracy. Additionally, the app includes a behavioral suggestions engine, which generates on-the-spot tips on how the user might react to a given situation.
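As a rough sketch of the classification post-processing described above (not the app's actual implementation), the model's per-label confidence scores can be reduced to a single top emotion, which the suggestions engine then turns into a tip. The label set and the `suggestion(for:)` mapping below are illustrative assumptions; in the real app the tips come from the OpenAI-backed suggestions engine:

```swift
import Foundation

/// Picks the most confident emotion from the model's raw scores.
/// The label names are assumptions; the actual classes depend on
/// how the YOLOv11 model was trained.
func topEmotion(from scores: [String: Float]) -> (label: String, confidence: Float)? {
    scores.max { $0.value < $1.value }.map { ($0.key, $0.value) }
}

/// Hypothetical on-device fallback for the behavioral suggestions engine.
func suggestion(for emotion: String) -> String {
    switch emotion {
    case "happy":     return "The person seems happy; a smile back is usually welcome."
    case "sad":       return "The person seems sad; consider asking if they are okay."
    case "angry":     return "The person seems angry; give them some space."
    case "surprised": return "The person seems surprised; they may need a moment."
    default:          return "No strong emotion detected; continue as usual."
    }
}

let scores: [String: Float] = ["happy": 0.82, "neutral": 0.10, "sad": 0.08]
if let top = topEmotion(from: scores) {
    print("\(top.label): \(suggestion(for: top.label))")
}
```

Keeping this stage as a pure function makes it easy to unit-test in `yolofaceTests` independently of the camera pipeline.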
You can access the full project on GitLab at:
or
The app is written in Swift, with the frontend built using the SwiftUI framework.
The emotion classification model was trained using Google Colab, a Python-based notebook environment. For further details, please refer to the Ultralytics documentation.
```
emotio/
├── yoloface/                 # Main project components
│   ├── Assets.xcassets/      # Project assets
│   ├── SVGs/                 # Static media for emoticons & icons
│   ├── best_new.mlpackage    # YOLOv11 classification model
│   └── Facenet6.mlmodel      # Face recognition model
├── yolofaceTests/            # Unit and integration tests
├── yolofaceUITests/          # UI test scripts
└── yoloface.xcodeproj        # Main project file
```
- Full Swift source code and Xcode project files
- Assets and resource files required by the app
- Configuration files (`Info.plist`, `.xcodeproj`, etc.)
- This README with build, install, and launch instructions
> **Note:** No compiled binaries (such as `.ipa` or DerivedData) are included in this repository.
> **Important:** To fully build the project and run the app, a macOS device is required alongside an iOS phone. An Android version of this app is planned for a future release.
To build the app from source, follow these steps:
- Clone the repository:
  ```
  git clone ssh://git@gitlab.upt.ro:8822/anthony.esber/emotio.git
  ```
  or
  ```
  git clone https://github.com/xatyy/emotio.git
  ```
- Open the project in Xcode:
  - Double-click the `.xcodeproj` file.
- Configure the signing settings:
  - Go to the project settings in Xcode.
  - Select your development team under Signing & Capabilities.
- Add your OpenAI API key:
  - Create a file `Secrets.xcconfig`.
  - Add your key: `OPENAI_API_KEY=sk-****************************`.
  - Make sure the file is added to the project's build configuration.
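A hedged sketch of how the key can reach the app at runtime, assuming `Secrets.xcconfig` is attached to the target's build configuration and that `Info.plist` forwards the setting via an entry `OPENAI_API_KEY = $(OPENAI_API_KEY)` (this wiring is an assumption about the project, not confirmed by the repository):

```swift
import Foundation

// Assumed wiring: Secrets.xcconfig defines OPENAI_API_KEY, and Info.plist
// contains OPENAI_API_KEY = $(OPENAI_API_KEY), so the value is substituted
// at build time and becomes readable from the app bundle.
func openAIKey(from bundle: Bundle = .main) -> String? {
    guard let key = bundle.object(forInfoDictionaryKey: "OPENAI_API_KEY") as? String,
          !key.isEmpty,
          key != "$(OPENAI_API_KEY)"   // substitution did not happen
    else {
        return nil
    }
    return key
}
```

If this returns `nil` at runtime, the most likely cause is that `Secrets.xcconfig` is not attached to the build configuration.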
- Build the app:
  - From the top menu, select Product > Build (or use Cmd + B).
- Connect your iOS device.
- Install the app:
  - In Xcode, choose your device.
  - Select Product > Run (or use Cmd + R).
- Trust the app certificates:
  - Go to Settings > General > VPN & Device Management > your developer email > Trust App.
- App launch:
  - The app launches automatically on the selected device once it is built and trusted.
- Xcode 14.0+
- macOS Ventura or newer
- iOS 15.0+ (or set your actual deployment target)
- A valid Apple Developer Account for code signing and deployment on a physical device
- Add Android compatibility.
- Increase model accuracy.
- Add text-to-speech for behavioral suggestions engine.
- The face recognition model was taken from daduz11/ios-facenet-id.
This project is licensed under the MIT License.
Developed as part of a Bachelor Thesis at Politehnica University of Timișoara, Faculty of Automation and Computers.
Made with <3 by xaty.
