Pseudo-Saliency for Human Gaze Simulation

dc.contributor.advisor: Faloutsos, Petros
dc.contributor.author: Caruana, Peter Nicholas
dc.date.accessioned: 2022-12-14T16:43:43Z
dc.date.available: 2022-12-14T16:43:43Z
dc.date.copyright: 2022-09-29
dc.date.issued: 2022-12-14
dc.date.updated: 2022-12-14T16:43:42Z
dc.degree.discipline: Computer Science
dc.degree.level: Master's
dc.degree.name: MSc - Master of Science
dc.description.abstract: Understanding and modeling human vision is an endeavor that can be, and has been, approached from multiple disciplines. Saliency prediction is a subdomain of computer vision that tries to predict the human eye movements made during guided or free viewing of static images. In simulation and animation, vision is also often modeled for the purpose of realistic and reactive autonomous agents; such models typically focus on plausible gaze movements of the eyes and head and are less concerned with scene understanding through visual stimuli. Bringing techniques and knowledge from computer vision into simulated virtual humans requires a methodology for generating saliency maps. Traditional saliency models are ill suited to this because of their large computational cost and the lack of control inherent in most deep-network-based models. The primary contribution of this thesis is a proposed model for generating pseudo-saliency maps for virtual characters, Parametric Saliency Maps (PSM). This parametric model computes saliency as a weighted combination of seven factors selected from the saliency and attention literature. Experiments show that the model is expressive enough to mimic results from state-of-the-art saliency models with a high degree of similarity, while being extraordinarily cheap to compute by virtue of running in the graphics processing pipeline of the simulation. As a secondary contribution, two models are proposed for saliency-driven gaze control. These models are expressive and present novel approaches to controlling the gaze of a virtual character using only visual saliency maps as input.
dc.identifier.uri: http://hdl.handle.net/10315/40788
dc.language: en
dc.rights: Author owns copyright, except where explicitly noted. Please contact the author directly with licensing requests.
dc.subject: Computer science
dc.subject.keywords: Saliency
dc.subject.keywords: Gaze
dc.subject.keywords: Virtual humans
dc.subject.keywords: Simulation
dc.subject.keywords: SALICON
dc.subject.keywords: Saliency maps
dc.subject.keywords: Saliency map
dc.subject.keywords: Gaze control
dc.subject.keywords: Virtual agents
dc.subject.keywords: Human gaze
dc.subject.keywords: Virtual pedestrians
dc.subject.keywords: Pedestrians
dc.subject.keywords: Casual pedestrians
dc.title: Pseudo-Saliency for Human Gaze Simulation
dc.type: Electronic Thesis or Dissertation
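The abstract describes the PSM model as a weighted combination of seven factor maps. A minimal sketch of that idea is shown below; the factor names, weights, and normalization here are illustrative assumptions, not the thesis's actual factors or implementation (which runs on the graphics pipeline rather than on the CPU).

```python
# Hypothetical sketch of a parametric pseudo-saliency map: saliency as a
# weighted sum of per-pixel factor maps, normalized by the total weight.
# Factor names ("motion", "proximity") and weights are assumed for illustration.
import numpy as np

def pseudo_saliency(factors: dict, weights: dict) -> np.ndarray:
    """Combine factor maps (each HxW, values in [0, 1]) into one saliency map."""
    total = sum(weights.values())
    out = np.zeros_like(next(iter(factors.values())), dtype=float)
    for name, fmap in factors.items():
        out += weights[name] * fmap
    return out / total  # keep the combined map in [0, 1]

# Toy 2x2 example with two assumed factors.
factors = {"motion": np.array([[1.0, 0.0], [0.0, 0.0]]),
           "proximity": np.array([[0.0, 1.0], [0.0, 0.0]])}
weights = {"motion": 2.0, "proximity": 1.0}
smap = pseudo_saliency(factors, weights)
```

Because the combination is a plain weighted sum, each weight offers a direct control knob over a factor's influence, which matches the abstract's emphasis on controllability relative to deep-network saliency models.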

Files

Original bundle (showing 1 - 1 of 1)
Thesis__2.2_.pdf (4.78 MB, Adobe Portable Document Format)
License bundle (showing 1 - 2 of 2)
license.txt (1.87 KB, Plain Text)
YorkU_ETDlicense.txt (3.39 KB, Plain Text)

Collections