EduSense is open-sourced with several goals in mind. Foremost, we hope that others will deploy the system and find value not only in the data it generates, but also in the opportunity to engage with topics surrounding smart classroom sensing (e.g., responsive pedagogy, professional development, privacy, automation, sensing fidelity). Second, EduSense can serve as a comprehensive springboard for testing and continually refining the underlying computer vision algorithms in classrooms (and similar settings). Finally, we designed EduSense with modularity in mind, hoping to cultivate a community that can improve the system and contribute new features and modules. All source code and instructional material can be found in our git repos: https://github.com/edusense/edusense and https://github.com/edusense/ClassroomDigitialTwins
EduSense works with most wired and wireless cameras that support RTSP, a common streaming protocol. We recommend PoE IP cameras, as a single cable provides both power and connectivity. If you are looking to purchase a camera for testing, Lorex's LNE8974BW is a good starting point (~$200 on Amazon), offering a 102° field of view and built-in audio.
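As a quick sanity check before pointing EduSense at a camera, it helps to construct and verify the camera's RTSP URL. The sketch below is a minimal, hypothetical helper (not part of EduSense itself); the stream path (`stream0` here) and port are assumptions that vary by vendor and model, so consult your camera's documentation for the exact values.

```python
# Build an RTSP URL for an IP camera. The default path and port below are
# placeholders (vendor-specific); check your camera's manual for the real ones.
from urllib.parse import quote

def rtsp_url(host: str, user: str, password: str,
             path: str = "stream0", port: int = 554) -> str:
    """Return an rtsp:// URL with percent-encoded credentials."""
    return (f"rtsp://{quote(user, safe='')}:{quote(password, safe='')}"
            f"@{host}:{port}/{path}")

# Special characters in credentials are safely percent-encoded:
url = rtsp_url("192.168.1.20", "admin", "p@ss/word")
print(url)  # rtsp://admin:p%40ss%2Fword@192.168.1.20:554/stream0
```

You can then test the stream with a player such as `ffplay <url>`, or programmatically via OpenCV's `cv2.VideoCapture(url)`, before wiring the camera into the pipeline.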
If you use our models or code, we would appreciate it if you cited the following papers:
Karan Ahuja, Dohyun Kim, Franceska Xhakaj, Virag Varga, Anne Xie, Stanley Zhang, Jay Eric Townsend, Chris Harrison, Amy Ogan, and Yuvraj Agarwal. 2019. EduSense: Practical Classroom Sensing at Scale. Proc. ACM Interact. Mob. Wearable Ubiquitous Technol. 3, 3, Article 71 (September 2019), 26 pages. DOI: https://doi.org/10.1145/3351229
Karan Ahuja, Deval Shah, Sujeath Pareddy, Franceska Xhakaj, Amy Ogan, Yuvraj Agarwal, and Chris Harrison. 2021. Classroom Digital Twins with Instrumentation-Free Gaze Tracking. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems (CHI '21). Association for Computing Machinery, New York, NY, USA, Article 484, 1–9. DOI: https://doi.org/10.1145/3411764.3445711