姬艳丽T成电 posted on 2015-12-8 10:00:01

UESTC Robotics Center human-human interaction activity database: CR-UESTC interaction database

This database was collected with a Kinect 2 sensor and contains RGB, depth, and skeleton data of human-human interaction behaviors in daily life, intended for validating action recognition algorithms. The database is described below:
We captured the RGB, depth and skeleton information of 10 interaction categories using Microsoft Kinect 2.0. The 10 interaction categories are handshaking, pushing, punching, arm around shoulder, kicking, high five, handover, bowing, hugging, and patting shoulder. In addition to the interaction categories found in existing databases (e.g. the UT database, the TV interaction database, and the SBU database), our database includes two new categories: arm around shoulder and patting shoulder. More than 15 persons took part in the data capture, forming 25 pairs of persons. A minimal sketch of a label mapping for these categories is given after this paragraph.
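
For convenience, here is a minimal sketch of how the 10 categories might be mapped to numeric labels. The category names come from the description above; the numeric ordering is my own assumption and may differ from the ordering used in the official release.

```python
# Hypothetical label mapping for the 10 interaction categories.
# The indices are an assumption; the downloaded database may define
# its own ordering or folder naming.
INTERACTION_CATEGORIES = [
    "handshaking",
    "pushing",
    "punching",
    "arm around shoulder",
    "kicking",
    "high five",
    "handover",
    "bowing",
    "hugging",
    "patting shoulder",
]

LABEL_OF = {name: idx for idx, name in enumerate(INTERACTION_CATEGORIES)}
```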

In the CR-UESTC database, each category is performed by 25 pairs of persons. The database was captured in an indoor environment; the background includes several persons who are running or playing table tennis, as well as some static desks and chairs. RGB videos in our database have a frame resolution of 960×540 pixels. The depth data has a resolution of 320×240 pixels, and each pixel carries a 13-bit depth value. The skeleton of one person contains 25 body joints: spinebase, spinemid, neck, head, shoulderleft, elbowleft, wristleft, handleft, shoulderright, elbowright, wristright, handright, hipleft, kneeleft, ankleleft, footleft, hipright, kneeright, ankleright, footright, spineshoulder, handtipleft, thumbleft, handtipright, and thumbright. Each joint is described by a 3-dimensional coordinate vector, (x, y, z), in a real-world coordinate system.
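
To illustrate the skeleton layout described above, here is a small Python sketch for holding and parsing one person's skeleton frame. The joint names and their order follow the list above; the assumption that a skeleton frame is stored as a flat line of 75 whitespace-separated floats (25 joints × 3 coordinates) is mine, and the parser should be adapted to the actual file layout shipped with the database.

```python
from dataclasses import dataclass
from typing import List, Tuple

# The 25 Kinect 2 joints, in the order given in the database description.
JOINT_NAMES = [
    "spinebase", "spinemid", "neck", "head",
    "shoulderleft", "elbowleft", "wristleft", "handleft",
    "shoulderright", "elbowright", "wristright", "handright",
    "hipleft", "kneeleft", "ankleleft", "footleft",
    "hipright", "kneeright", "ankleright", "footright",
    "spineshoulder", "handtipleft", "thumbleft",
    "handtipright", "thumbright",
]

@dataclass
class SkeletonFrame:
    """One person's skeleton in a single frame: 25 joints,
    each an (x, y, z) coordinate in the real-world coordinate system."""
    joints: List[Tuple[float, float, float]]

def parse_skeleton_line(line: str) -> SkeletonFrame:
    """Parse one line of 75 floats laid out as 'x y z x y z ...'.
    This flat layout is an assumption about the file format,
    not a confirmed specification of the CR-UESTC release."""
    values = [float(v) for v in line.split()]
    assert len(values) == 3 * len(JOINT_NAMES), "expected 75 values per skeleton"
    joints = [
        (values[i], values[i + 1], values[i + 2])
        for i in range(0, len(values), 3)
    ]
    return SkeletonFrame(joints=joints)
```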
The database is available at http://www.uestcrobot.net/?q=download.
Researchers and students working on action recognition are welcome to use it.

刘培四川大学 posted on 2016-4-28 10:50:52

Thanks! :)