Real-Time Vision for Human-Computer Interaction

Edited by
Branislav Kisacanin, Delphi Corporation
Vladimir Pavlovic, Rutgers University
Thomas S. Huang, University of Illinois at Urbana-Champaign

This work may not be translated or copied in whole or in part without the written permission of the publisher (Springer Science+Business Media, Inc., 233 Spring Street, New York, NY 10013, USA), except for brief excerpts in connection with reviews or scholarly analysis. Use in connection with any form of information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed is forbidden. The use in this publication of trade names, trademarks, service marks, and similar terms, even if they are not identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary rights.

Library of Congress Cataloging-in-Publication Data available.

To Saska, Milena, and Nikola (BK)
To Karin, Irena, and Lara (VP)
To Pei (TSH)

Contents

Part I: Introduction
RTV4HCI: A Historical Overview (Matthew Turk)
Real-Time Algorithms: From Signal Processing to Computer Vision (Branislav Kisacanin, Vladimir Pavlovic)

Part II: Advances in RTV4HCI
Recognition of Isolated Fingerspelling Gestures Using Depth Edges (Rogerio Feris, Matthew Turk, Ramesh Raskar, Kar-Han Tan, Gosuke Ohashi)
Appearance-Based Real-Time Understanding of Gestures Using Projected Euler Angles (Sharat Chandran, Abhineet Sawa)
Flocks of Features for Tracking Articulated Objects (Mathias Kolsch, Matthew Turk)
Static Hand Posture Recognition Based on Okapi-Chamfer Matching (Hanning Zhou, Dennis J. Lin, Thomas S. Huang)
Visual Modeling of Dynamic Gestures Using 3D Appearance and Motion Features (Guangqi Ye, Jason J.)
Head and Facial Animation Tracking Using Appearance-Adaptive Models and Particle Filters (Franck Davoine, Fadi Dornaika)
A Real-Time Vision Interface Based on Gaze Detection: EyeKeys (John J.)
Map Building from Human-Computer Interactions (Artur M.)

Part III: Looking Ahead
Vision-Based HCI Applications (Eric Petajan)
Real-Time Inference of Complex Mental States from Facial Expressions and Head Gestures (Rana el Kaliouby, Peter Robinson)
Epipolar Constrained User Pushbutton Selection in Projected Interfaces (Amit Kale, Kenneth Kwan, Christopher Jaynes)
MPEG-4 Face and Body Animation Coding Applied to HCI (Eric Petajan)
The Office of the Past (Jiwon Kim, Steven M.)
The following example shows code for a Microsoft Office Word document.

If the control is not visible in the designer, double-click HelloControl in Solution Explorer.

Add the code to the Click event handler of the button:

Private Sub Button1_Click(ByVal sender As System.Object, ByVal e As System.EventArgs) _
    Handles Button1.Click

private void button1_Click(object sender, System.EventArgs e)

In C#, you must add an event handler for the button click. You can place this code in the HelloControl constructor after the call to InitializeComponent:

this.button1.Click += new EventHandler(this.button1_Click);

For information about how to create event handlers, see How to: Create Event Handlers in Office Projects.

To add the user control to the actions pane

Add the following code to the ThisDocument or ThisWorkbook class as a class-level declaration (do not add this code to a method):

private HelloControl hello = new HelloControl();

To show the actions pane, add the user control to the Controls property of the ThisDocument.ActionsPane field (Word) or the ThisWorkbook.ActionsPane field (Excel). Add the following code to the ThisDocument_Startup event handler of the ThisDocument class or the ThisWorkbook_Startup event handler of the ThisWorkbook class:

this.ActionsPane.Controls.Add(hello);
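Taken together, the fragments above form a small ThisDocument class. The following is a minimal sketch of how the pieces fit in a C# VSTO Word document project; it assumes a user control named HelloControl with a single button, as in the example above, and is not runnable outside the Office/VSTO runtime.

```csharp
// ThisDocument.cs in a VSTO Word document project (sketch only; assumes a
// HelloControl user control, as in the example above).
public partial class ThisDocument
{
    // Class-level declaration: the control must outlive any single method call,
    // so it is a field rather than a local variable inside Startup.
    private HelloControl hello = new HelloControl();

    private void ThisDocument_Startup(object sender, System.EventArgs e)
    {
        // Adding the control to the ActionsPane's Controls collection is what
        // makes the actions pane appear when the document opens.
        this.ActionsPane.Controls.Add(hello);
    }
}
```

In an Excel workbook project the same two statements go in the ThisWorkbook class and its ThisWorkbook_Startup handler instead.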