Real-Time and Accurate Full-Body Multi-Person Pose Estimation & Tracking System
Awesome work on hand pose estimation/tracking
Computer vision library for human-computer interaction. It implements head pose and gaze direction estimation using convolutional neural networks, skin detection through backprojection, motion detection and tracking, and saliency maps.
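The skin-detection-through-backprojection idea mentioned above can be sketched without the library: a hue histogram of a known skin sample acts as a lookup table that scores every pixel of an image. This is a minimal NumPy sketch of the technique (the repository presumably uses OpenCV's `calcBackProject`); the function name and bin count here are illustrative, not the library's API.

```python
import numpy as np

def skin_backprojection(hue_img, skin_hues, bins=32):
    """Score each pixel by how often its hue occurs in a skin sample.

    hue_img:   2-D array of hue values in [0, 180) (OpenCV hue convention).
    skin_hues: 1-D array of hue values sampled from known skin pixels.
    Returns a map in [0, 1]; high values indicate skin-like hues.
    """
    # Histogram of the skin sample, normalized so the peak bin is 1.0.
    hist, _ = np.histogram(skin_hues, bins=bins, range=(0, 180))
    hist = hist / hist.max()
    # Back-project: look up each pixel's histogram bin value.
    idx = np.clip((hue_img / 180 * bins).astype(int), 0, bins - 1)
    return hist[idx]
```

Thresholding the returned map (e.g. at 0.5) yields a binary skin mask.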
Orchestra is a human-in-the-loop AI system for orchestrating project teams of experts and machines.
Notes for Human Computer Interaction course - CS6750
Code for CVPR'18 spotlight "Weakly and Semi Supervised Human Body Part Parsing via Pose-Guided Knowledge Transfer"
Easy-to-use Python command-line tool to generate a gaze-point heatmap from a CSV file. 👁️
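Gaze-point heatmaps of this kind are typically built by accumulating a Gaussian blob at each fixation coordinate. A minimal NumPy sketch, assuming the CSV has already been parsed into (x, y) pixel coordinates; the function name and `sigma` default are hypothetical, not the tool's actual interface.

```python
import numpy as np

def gaze_heatmap(points, width, height, sigma=25):
    """Accumulate a Gaussian blob at each (x, y) gaze point.

    points: iterable of (x, y) pixel coordinates.
    Returns a (height, width) array normalized to [0, 1].
    """
    ys, xs = np.mgrid[0:height, 0:width]
    heat = np.zeros((height, width))
    for x, y in points:
        heat += np.exp(-((xs - x) ** 2 + (ys - y) ** 2) / (2 * sigma ** 2))
    return heat / heat.max()
```

The result can then be overlaid on the stimulus image, e.g. with matplotlib's `imshow` and an alpha channel.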
👀 Use machine learning in JavaScript to detect eye movements and build gaze-controlled experiences.
openEMSstim: open-hardware module to adjust the intensity of EMS/TENS stimulators.
VR driving 🚙 + eye tracking 👀 simulator based on CARLA for driving interaction research
A curated list of awesome affective computing 🤖❤️ papers, software, open-source projects, and resources
Wearable computing software framework for intelligence augmentation research and applications. Easily build smart glasses apps using built-in voice commands, speech recognition, computer vision, UI, sensors, smartphone connection, NLP, facial recognition, database, cloud connection, and more. This repo is in beta.
GitHub page for students in HCI courses at Handong University
Code and data belonging to our CSCW 2019 paper: "Dark Patterns at Scale: Findings from a Crawl of 11K Shopping Websites".
The official implementation for ICMI 2020 Best Paper Award "Gesticulator: A framework for semantically-aware speech-driven gesture generation"
This repository contains the Domain Discovery Tool (DDT) project. DDT is an interactive system that helps users explore and better understand a domain (or topic) as it is represented on the Web.
Fist, palm and hand detection & tracking for intelligent human-computer interaction game character movement control with OpenCV on Java (Processing sketchbook).
Python code that detects facial landmarks and infers expressions such as smiling from them. It automatically takes a photo when the person smiles; raising both eyebrows starts music playback, and blinking the right eye stops it.
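Expression cues like a smile are often reduced to simple geometry on the detected landmarks, such as the mouth's width-to-height ratio. A minimal sketch under that assumption; the landmark layout, helper names, and threshold below are illustrative, not the repository's actual values:

```python
import numpy as np

def mouth_aspect_ratio(mouth):
    """mouth: (4, 2) array of (x, y) landmark points in the order
    [left corner, right corner, top lip, bottom lip] (hypothetical layout)."""
    left, right, top, bottom = mouth
    width = np.linalg.norm(right - left)
    height = np.linalg.norm(bottom - top)
    return width / height

def is_smiling(mouth, threshold=3.0):
    # A wide, flat mouth (high width/height ratio) is a crude smile cue.
    return mouth_aspect_ratio(mouth) > threshold
```

In practice the landmarks would come from a detector such as dlib's 68-point predictor, and the threshold would be tuned per camera setup.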
This is the research repository for Vid2Doppler: Synthesizing Doppler Radar Data from Videos for Training Privacy-Preserving Activity Recognition.
Website of the Georgia Tech Visualization Lab