Projects:Software framework for multi-sensory environments

From Collective Computational Unit
Latest revision as of 16:34, 8 December 2020

Overview

The purpose of this project is to develop a software framework that allows effective use of experimental facilities with multiple sensors, e.g. the imaging barn. The framework will standardize synchronized data collection, data sharing (formats), and data manipulation (processing). Such standardization will promote collaborative development and support the transfer of technology and methods between different facilities, i.e. the Barn, the Imaging Hangar, and the human-tracking facility at the Psychology Department.

Closely related to Projects:Improve tracking of individual markers and marker patterns and Projects:Augment marker tracking with visual tracking.

Details: File:Cluster-Medium-Project-Grant HemalNaik.pdf

Contact

  • Hemal Naik, hnaik@ab.mpg.de
  • Mathias Günther (long-term support), mathias.guenther@uni-konstanz.de

Collaborators

  • Oliver Deussen (host)
  • Iain Couzin
  • Britta Renner
  • Mate Nagy, mnagy@ab.mpg.de
  • Bastian Goldluecke, bastian.goldluecke@uni-konstanz.de

Aims

The main aim of the project is to provide a repository of useful code snippets that allow users to quickly work with the data generated in the imaging facilities. The code will ideally serve as a base on top of which users can build their own projects.

Key Features

  • Reading data generated by VICON mo-cap system in the Imaging Barn.
  • Augmenting 3D data on videos
  • Annotation of keypoints on video
  • Verification of 6-DOF patterns
  • 6-DOF pose independent of VICON system
  • Stereo triangulation of points from video
  • 3D-Transformations in different coordinate systems
  • Reading data stream over network for real-time experiments [Info]

Note: These features are under constant development, and this page is updated monthly.
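To illustrate the 3D-transformations feature listed above, here is a minimal sketch of moving points between two coordinate systems with a rigid (rotation + translation) transform. This is not code from the repository; the function names and the example frames are illustrative only.

```python
import numpy as np

def make_rigid_transform(R, t):
    """Assemble a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def apply_transform(T, points):
    """Apply a 4x4 homogeneous transform to an (N, 3) array of points."""
    homo = np.hstack([points, np.ones((len(points), 1))])
    return (T @ homo.T).T[:, :3]

# Example: a local tracking frame rotated 90 degrees about z and shifted by
# (1, 2, 0) relative to the world frame.
theta = np.pi / 2
Rz = np.array([[np.cos(theta), -np.sin(theta), 0.0],
               [np.sin(theta),  np.cos(theta), 0.0],
               [0.0,            0.0,           1.0]])
T_world_from_local = make_rigid_transform(Rz, np.array([1.0, 2.0, 0.0]))

local_pts = np.array([[1.0, 0.0, 0.0]])
world_pts = apply_transform(T_world_from_local, local_pts)  # -> approx. [1, 3, 0]

# The inverse transform maps world coordinates back into the local frame.
T_local_from_world = np.linalg.inv(T_world_from_local)
back = apply_transform(T_local_from_world, world_pts)  # -> approx. [1, 0, 0]
```

Chaining such 4x4 matrices is the standard way to hop between camera, rig, and world coordinate systems without tracking rotations and translations separately.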

Milestones

The first soft release with example code is planned for 22 December, during a presentation with researchers from the cluster. Ideally, we will introduce a basic repository with preliminary functionality that can be used immediately.


Example code

  • Realtime stream reading with VICON SDK
  • Custom 6-DOF tracking with VICON 3D Points and comparison with VICON 6-DOF tracking
  • Realtime stream reading without VICON SDK
  • Manual annotation of custom frames
  • Creating 2D-3D annotation datasets from manual annotations or VICON positions
  • Stereo triangulation
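As an illustration of the stereo-triangulation example listed above, the following sketch recovers a 3D point from two 2D observations using the linear DLT method. It is not code from the repository; the camera matrices and helper function are hypothetical, with synthetic pinhole cameras standing in for calibrated video cameras.

```python
import numpy as np

def triangulate_dlt(P1, P2, x1, x2):
    """Triangulate one 3D point from two views via the linear DLT method.

    P1, P2 : (3, 4) camera projection matrices
    x1, x2 : (2,) pixel coordinates of the same point in each view
    """
    # Build the homogeneous system A @ X = 0 from the constraint x cross (P @ X) = 0
    A = np.stack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The solution is the right singular vector with the smallest singular value
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

# Two synthetic pinhole cameras: identity pose and a 1-unit baseline along x
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known 3D point into both views to get the 2D observations
X_true = np.array([0.5, 0.2, 4.0])
h1 = P1 @ np.append(X_true, 1.0)
x1 = h1[:2] / h1[2]
h2 = P2 @ np.append(X_true, 1.0)
x2 = h2[:2] / h2[2]

X_est = triangulate_dlt(P1, P2, x1, x2)  # recovers approx. [0.5, 0.2, 4.0]
```

With noisy real detections, the same linear system gives a least-squares estimate, which is typically refined by nonlinear reprojection-error minimization.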

Test Data

The sample datasets listed below will be made available soon.

PigeonPostureDataset

StarlingDataset

PigeonGazeDataset

Example Projects

These projects will be presented in complete form at the end of the project.

  • Adding external camera calibration [Showing hardware integration]
  • AR visualization of 3D movement [Showing interactive visualization capabilities]
  • Interactive drone flight, recording data from an area of interest interactively [Showing real-time control and optimized data capture]
  • Live tracking of posture [Showing ML capabilities]