SIPBILD
  SIPBILD 1
  Facial expression and gesture detection in video sequences  
Contact
sipbild@forsip.de

Management
Prof. Dr. Bernd Radig

Download
Flyer (PDF, German)
Publications
Publications for SIPBILD
Selected press coverage
Affective Computing: Rechner mit Gefühl
c't - magazin für Computertechnik 18/2004, pp. 88 ff., D. Wiegand
Computer und Emotionen
c't magazin of Hessischer Rundfunk (HR), broadcast of September 18th, 2004
10 Jahre abayfor Feier
Rundschau Magazin of Bayerisches Fernsehen, July 26th, 2004
Deine Zukunft auf Erfolgskurs: Das Informatik-Studium an der TU München
Schülerinformationsbroschüre (student information brochure), August 2005, p. 3 (German)
  Procedure
Snap your fingers and the light comes on
The user should be able to tell the computer what he wants in the easiest possible way. SIPBILD therefore records the surroundings of a computer system, including the user, in video sequences. This information is then evaluated and gives the computer the knowledge it needs to fulfil the user's wishes.

Detecting gestures
The computer user can thus trigger certain tasks with gestures. Gesture detection can be used, for example, to control living comfort in high-tech houses: the user controls the lighting in the living room by giving instructions in the form of arm movements. Raising an arm signals "increase the brightness" to the computer.

The procedure is as follows (a short code sketch is given after the list):

- Record the environment of a computer system by means of video sequences
- Calculate the differences from image to image
- Generate spatial and temporal movement masks
- Calculate movement features
- Classify the observed movements into instruction gestures
- Execute the corresponding command
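
To make the pipeline concrete, here is a minimal sketch in Python with OpenCV. It assumes a camera at index 0 and a deliberately simple rule (an upward shift of the motion-mask centroid is read as "increase"); the movement features and the classifier actually used in SIPBILD are not reproduced here.

# Minimal sketch of the gesture pipeline outlined above (not the SIPBILD implementation).
import cv2

def classify_motion(dy):
    """Map vertical centroid displacement to an instruction gesture."""
    if dy < -15:            # centroid moved up in image coordinates
        return "increase"
    if dy > 15:             # centroid moved down
        return "decrease"
    return None

cap = cv2.VideoCapture(0)                                      # 1. record the environment
ok, prev = cap.read()
if not ok:
    raise SystemExit("no camera available")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
prev_cy = None

while True:                                                    # stop with Ctrl+C
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(gray, prev_gray)                        # 2. image-to-image differences
    _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)  # 3. movement mask
    m = cv2.moments(mask)                                      # 4. movement feature: mask centroid
    if m["m00"] > 0:
        cy = m["m01"] / m["m00"]
        if prev_cy is not None:
            gesture = classify_motion(cy - prev_cy)            # 5. classify the movement
            if gesture == "increase":
                print("command: increase the brightness")      # 6. execute the command
        prev_cy = cy
    prev_gray = gray
cap.release()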

[Figure: SIPBILD detects the raising of the arm and interprets the movement as "increase".]

Detecting facial expression
Detecting the facial expression of a user can also be of great interest. Knowledge of the user's emotional state can, for example, help the sales agent from the COSIMA subproject to optimise the sales negotiation.

How does SIPBILD work?

A dot model of the main facial contours is first generated for a neutral, a laughing and a surprised expression. This model is then compared with the corresponding contours extracted from the recorded video sequences, giving the computer information about the emotional state of the user. After evaluating this data, the sales agent from the COSIMA subproject can determine whether the user is satisfied with the product bought and, if necessary, optimise the sales negotiation.
 
[Figure: Dot model. A dot model is generated from the main facial contours and compared with the currently recorded video sequences.]
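
A minimal sketch of the dot-model comparison, in Python with NumPy. The three template point sets below are placeholders rather than the project's real models, and it is assumed that the facial contour points have already been extracted from the current frame and aligned to the templates' coordinate frame; the expression whose model lies closest in mean point distance is chosen.

# Minimal sketch of dot-model matching (placeholder coordinates, not SIPBILD's models).
import numpy as np

templates = {                       # one dot model per expression
    "neutral":   np.array([[0.0, 0.0], [1.0, 0.0], [0.5, 1.0]]),
    "laughing":  np.array([[0.0, 0.1], [1.0, 0.1], [0.5, 1.2]]),
    "surprised": np.array([[0.0, -0.2], [1.0, -0.2], [0.5, 1.5]]),
}

def classify_expression(points):
    """Return the expression whose dot model is closest to the observed contour points."""
    def distance(model):
        return np.mean(np.linalg.norm(points - model, axis=1))
    return min(templates, key=lambda name: distance(templates[name]))

# Example: contour points extracted from the current video frame (here invented).
observed = np.array([[0.0, 0.09], [1.0, 0.11], [0.5, 1.19]])
print(classify_expression(observed))   # -> "laughing"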
  Application
Part of a vision
Adaptive action systems that record and react to their environment are, as in SIPBILD, also being developed in the projects SIPADIM and SIKOWO. Through the interaction of these systems, images and speech can be processed simultaneously. This opens up innovative applications that should make everyday life easier and more comfortable; adaptive and intuitive communication is indispensable, for example, for controlling service robots.
   
SIKOWO 1 SIKOWO 2
  • Tracking persons
  • Image processing with several video cameras
  • Situational, personalised
  • Tracking of movement
  • Recognition of pointing actions
  • Control and regulation
SIPBILD 1 SIPBILD 2
  • Evaluating video sequences
  • Controlling complex systems through gestures and facial expressions
  • Application in environments with changing conditions
  • Recognizing more facial expressions and gestures.
SIPADIM-1 SIPADIM-2
  • Controlling complex systems through natural speech
  • Context-dependent dialogue management
  • Self-explanation of the system
  • Guiding the user
  • Plan-based processing
COSIMA B2B COSIMA T
  • Sales agent
  • Fully automated
  • Sales psychology dialogue
  • Individual Travel Composition
  • COSIMA in Tourism
SIPKIS SIPKIS 2
  • Personalised financial services
  • Risk and flexibility taken into account
  • Situations and roles
  • Selection of efficient and individual portfolios
  • Legal regulations
  • Taxes and social capital
SIPREACT SIPREACT 2
  • Adaptive documents and workflows
  • Recommender system prototype
  • Adaptive information and assistance systems
  • Tools and author support for document variants
Role and company modelling
  • Operational information systems that adapt to situations, roles and personal preferences
  • User-friendly, individualised man-machine dialogue
  • "Just-in-time" data acquisition to meet specific needs
TRUSTEE 1 TRUSTEE 2
  • Fiduciary offers, negotiations and conclusions for clients
  • Anonymity guaranteed on the Internet
  • Application-based selection of products
  • Help in assessing the quality of extracted offers