A curated list of egocentric (first-person) vision and related area resources
-
Updated Oct 14, 2024
[NeurIPS 2024] Official code for HourVideo: 1-Hour Video Language Understanding
A repo for training and finetuning models for hand segmentation.
[CVPR 2022] Joint hand motion and interaction hotspots prediction from egocentric videos
Action Scene Graphs for Long-Form Understanding of Egocentric Videos (CVPR 2024)
[MICCAI2023 Oral] POV-Surgery: A Dataset for Egocentric Hand and Tool Pose Estimation During Surgical Activities
Code for the paper "Differentiable Task Graph Learning: Procedural Activity Representation and Online Mistake Detection from Egocentric Videos" [NeurIPS (spotlight), 2024]
The official code and data for paper "VidEgoThink: Assessing Egocentric Video Understanding Capabilities for Embodied AI"
Making a long story short: A multi-importance fast-forwarding egocentric videos with the emphasis on relevant objects @ Journal of Visual Communication and Image Representation 53 (2018)
A third-party implementation of the paper "The Audio-Visual Conversational Graph: From an Egocentric-Exocentric Perspective".
A Weighted Sparse Sampling and Smoothing Frame Transition Approach for Semantic Fast-Forward First-Person Videos @ IEEE Conference on Computer Vision and Pattern Recognition (CVPR) 2018
Official code release for "Generative Adversarial Network for Future Hand Segmentation from Egocentric Video" (ECCV 2022)
A collection of related papers and datasets for research.
CVMHAT: Multiple Human Association and Tracking from Egocentric and Complementary Top Views, IEEE TPAMI.