Gaze Estimation on Handheld Mobile Devices
Eye-Tracking Continuous Calibration on Mobile
Gaze-Based Interaction
Context Awareness by Sensors
Eye-Movement as Biometrics
Biometrics
Deepfake Detection
Privacy and Security for Handheld Mobile Devices
UKRI BBSRC, Eye Movement for Mental Fatigue Project, Total funding of $1.6M
June 2025 - Present
Role: Research Fellow for eye movement analysis and computer vision research
UKRI AI Research Resource Isambard-AI GPU Supercomputer
Oct 2025 - Jan 2026
Resources: 10,000 GPU hours
Role: PI
Innovate UK CyberASAP Funding
April 2024 - Jan 2025
Innovate UK-sponsored Cyber Security Academic Startup Accelerator Programme (CyberASAP)
Total funding of £120,000 awarded.
Role: PI
Gaze Estimation for Unconstrained Eye Tracking on Handheld Mobile Devices / Any Surface
Deep learning model-driven Eye-Tracking System on iOS & Android, Done
Exploring Robust Gaze Interaction Methods on Mobile, Done
Exploring the Factors that Impact Gaze Estimation / Eye Tracking on Mobile, Done, under peer review
TensorFlow Lite / PyTorch Gaze Estimation Library, Done, awaiting submission / open-source release (under IP review)
Addressing the dynamic factors that affect eye tracking through:
A new gaze estimation dataset with sensor data, Done, private
A new continual-learning-based calibration approach for in-the-wild use, Done, under peer review
A multi-modal gaze estimation model project, Done (illustrative sketch below)
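A minimal sketch of the multi-modal idea referenced above, assuming an eye-image CNN branch fused with device motion-sensor (IMU) features by simple concatenation; the layer sizes, input shapes, 6-channel IMU vector, and fusion strategy are illustrative assumptions, not the published architecture.

```python
# Illustrative multi-modal gaze estimator: eye-image CNN branch fused with
# IMU (accelerometer + gyroscope) features. All dimensions are assumptions.
import torch
import torch.nn as nn

class MultiModalGazeNet(nn.Module):
    def __init__(self, imu_dim: int = 6):
        super().__init__()
        # Appearance branch: small CNN over a 64x64 grayscale eye crop.
        self.eye_branch = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Sensor branch: MLP over a flattened IMU reading.
        self.imu_branch = nn.Sequential(nn.Linear(imu_dim, 32), nn.ReLU())
        # Late fusion head regressing a 2-D on-screen gaze point (x, y).
        self.head = nn.Sequential(nn.Linear(32 + 32, 64), nn.ReLU(), nn.Linear(64, 2))

    def forward(self, eye_img: torch.Tensor, imu: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([self.eye_branch(eye_img), self.imu_branch(imu)], dim=1)
        return self.head(feats)

# Dummy forward pass: batch of 4 eye crops with matching IMU vectors.
model = MultiModalGazeNet()
gaze = model(torch.randn(4, 1, 64, 64), torch.randn(4, 6))
print(gaze.shape)  # torch.Size([4, 2])
```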
Gaze for Security & Privacy
Gaze and human activity recognition (HAR) on camera-based devices, Done, under peer review
A security and privacy study, Done, under peer review
Social Gaze for Privacy, Done (interviews conducted with experts and the general public)
Currently exploring eye movement for cybersecurity, funded by Innovate UK (see the sketch below)
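For flavour, below is a minimal sketch of the kind of eye-movement feature extraction such work typically builds on: velocity-threshold (I-VT) segmentation of a gaze trace into fixation and saccade samples. The 30 deg/s threshold, 120 Hz sampling rate, and synthetic trace are illustrative assumptions, not the project's actual pipeline.

```python
# Illustrative I-VT (velocity-threshold) segmentation of a gaze trace into
# fixation vs. saccade samples. Threshold, sampling rate, and the synthetic
# trace are assumptions for demonstration only.
import numpy as np

def ivt_labels(gaze_deg: np.ndarray, fs: float, threshold_deg_s: float = 30.0) -> np.ndarray:
    """Label each gaze sample as fixation (0) or saccade (1).

    gaze_deg: (N, 2) horizontal/vertical gaze angles in degrees.
    fs: sampling rate in Hz.
    """
    # Point-to-point angular speed in deg/s (repeat the first value to keep length N).
    step = np.linalg.norm(np.diff(gaze_deg, axis=0), axis=1) * fs
    velocity = np.concatenate([[step[0]], step])
    return (velocity > threshold_deg_s).astype(int)

# Synthetic trace: a fixation, a fast ~10-degree saccade, then another fixation.
fs = 120.0  # Hz
fix1 = np.zeros((60, 2))
sacc = np.linspace([0.0, 0.0], [10.0, 0.0], 6)
fix2 = np.full((60, 2), [10.0, 0.0])
labels = ivt_labels(np.vstack([fix1, sacc, fix2]), fs)
print("saccade samples:", int(labels.sum()), "of", len(labels))
```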
Project 1: Done
Project 2: Done
Project 3: Done
Our method achieves top performance on benchmarks
Project 4: Done
Our method achieves top performance on benchmarks
Project 5: Ongoing
Project 6: Ongoing