AvaScholar

Visual Applications Framework in Immersive Virtual Educational Environments

AvaScholar (a portmanteau of Avatar and Scholar) leverages parallelism to improve internet-based education with new tools for both the instructor and the learner. The AvaScholar Instructor module creates real-time avatars and models of the instructor and any visual aids using parallel 3D capture and reconstruction software, allowing students to view the instructor or visual aids from any angle. The AvaScholar Learner module applies new parallel machine-learning “soft biometrics” techniques that use each student’s webcam to help the instructor gauge how well large classes of remote students understand the material (through expression recognition and shrug detection), broken down by automatically recognized demographic groups such as age or gender.
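
As a rough picture of the Learner pipeline, each webcam frame goes through a simple loop: detect the student’s face, classify its expression, and report an aggregate signal to the instructor’s view. The Python/OpenCV sketch below illustrates only that flow; classify_expression and report_to_instructor are hypothetical placeholders standing in for AvaScholar’s parallel soft-biometrics models, which are not reproduced here.

```python
# Illustrative Learner-side loop (not the AvaScholar implementation).
# Uses OpenCV's stock face detector; the expression classifier and the
# reporting hook are hypothetical placeholders.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def classify_expression(face_img):
    """Placeholder for a parallel expression / soft-biometrics model."""
    return {"engaged": True, "confusion": 0.1}

def report_to_instructor(result):
    """Placeholder for aggregating and sending signals to the instructor."""
    print(result)

cap = cv2.VideoCapture(0)  # student's webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        report_to_instructor(classify_expression(gray[y:y + h, x:x + w]))
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
```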

The Benefit of Parallelism

Immersive interfaces with real-time emotive recognition and processing are excellent performance drivers for client-side parallel computing. The real-time and jitter-free constraints preclude migrating the entire computation to the cloud and demand high-performance visual algorithms on the client. From a research perspective, new work is required in a number of areas: computer vision, image processing, 3D graphics, data analysis, audio processing, natural language processing, speech recognition, and QoS-driven systems.

AvaScholar Visual Processing Components

The AvaScholar system includes components for high-performance visual and emotive processing that are composed in real time. Components include face recognition and tracking, emotion and expression detection and tracking, 3D reconstruction, image feature detection, and scene rendering. We are currently working with parallel implementations of KD-tree, SIFT, SURF, SfM stereo reconstruction, and image stitching codes. These components provide the basis for the AvaScholar educational application as well as other immersive applications.
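
To make the component list concrete, the sketch below shows a serial Python/OpenCV version of one such building block: SIFT feature detection followed by KD-tree (FLANN) matching between two views, the kind of stage that feeds image stitching or SfM reconstruction. It is illustrative only; the input file names are placeholders, and AvaScholar’s actual components are parallel implementations of these algorithms.

```python
# Illustrative serial version of a visual-processing component: SIFT
# features matched with a KD-tree (FLANN) index, as used in stitching
# and SfM pipelines. Input images are placeholders.
import cv2

img1 = cv2.imread("view1.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("view2.png", cv2.IMREAD_GRAYSCALE)

sift = cv2.SIFT_create()
kp1, des1 = sift.detectAndCompute(img1, None)
kp2, des2 = sift.detectAndCompute(img2, None)

# FLANN matcher with a KD-tree index (algorithm=1 selects KD-trees);
# this matching stage is a natural target for parallel KD-tree code.
flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=5), dict(checks=50))
matches = flann.knnMatch(des1, des2, k=2)

# Lowe's ratio test keeps only distinctive correspondences.
good = [m for m, n in matches if m.distance < 0.7 * n.distance]
print(len(good), "putative correspondences for stitching/reconstruction")
```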

Application Framework

Project AvaScholar’s modular application framework will monitor and control the hardware and software components within the system and implement the end-user applications on top of those components. APIs will be designed for initialization, control, data transfer, signaling, and monitoring of all system components. This decouples components for development and testing, supports new devices and algorithms as they become available, and reduces dependencies between components. We envision that the applications will be implemented in a high-level scripting language that promotes flexibility and rapid reconfiguration. The framework will be designed for flexibility and reusability to support both our proposed end-user applications and the unanticipated interactive applications that we expect will emerge.
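
As a minimal sketch of what such a component API could look like (assuming a Python-style scripting layer; all class and method names here are illustrative assumptions, not the project’s published interfaces), every device or algorithm would expose the same lifecycle so that components can be initialized, composed, monitored, and swapped independently.

```python
# Hypothetical component API sketch; names are assumptions for illustration.
from abc import ABC, abstractmethod

class Component(ABC):
    @abstractmethod
    def init(self, config: dict) -> None: ...   # initialization
    @abstractmethod
    def start(self) -> None: ...                # control
    @abstractmethod
    def stop(self) -> None: ...                 # control
    @abstractmethod
    def status(self) -> dict: ...               # monitoring / signaling

class Framework:
    """Registers components, wires them up, and reports their status."""
    def __init__(self):
        self.components = {}

    def register(self, name: str, component: Component, config: dict) -> None:
        component.init(config)
        self.components[name] = component

    def start_all(self) -> None:
        for component in self.components.values():
            component.start()

    def monitor(self) -> dict:
        return {name: c.status() for name, c in self.components.items()}

# An application script would then compose components, for example:
#   fw = Framework()
#   fw.register("capture", WebcamCapture(), {"device": 0})               # hypothetical
#   fw.register("expression", ExpressionTracker(), {"model": "default"}) # hypothetical
#   fw.start_all()
```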

For More Information

Contacts: John C. Hart (jch@illinois.edu), Sanjay Patel (sjp@illinois.edu), Minh Do (minhdo@illinois.edu), and Thomas Huang (t-huang1@illinois.edu)


I2PC Illinois is a joint research effort of the Illinois Department of Computer Science, the Department of Electrical and Computer Engineering, and the Coordinated Science Laboratory, with funding from corporate partner Intel. Its work is conducted by faculty members and graduate students from the computer science and electrical and computer engineering departments at the University of Illinois.