With the development of real-time photorealism in 3D
imagery, paired with an increasing interest in
both TV series and video games, this project
explores ways of adapting photorealistic
narratives in real time depending on the
viewer's facial expression. Through an analysis,
image processing, branching narratives, and
real-time rendering technology were explored. A
product was created through iterative
development as a proof of concept. This product
would detect a user's smile and change the
weather in a virtual environment. The product
was tested on 34
participants, who were intrigued and generally
liked the concept. It was further evaluated by an
expert, who confirmed that it served as a proof
of concept. In the future, the product should be
able to detect more facial expressions and adapt
the narrative without the viewer's awareness.
However, further research, development, and
testing would be needed.
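The core branching mechanism described above, a smile-detector score driving the weather state, can be sketched in a few lines. This is a minimal illustration with hypothetical names (`choose_weather`, the threshold value, and the weather labels are assumptions, not the project's actual API); the real product would feed the score from an image-processing pipeline into the rendering engine.

```python
# Minimal sketch of a smile-driven weather switch.
# Names and threshold are illustrative assumptions, not the project's code.

def choose_weather(smile_probability: float, threshold: float = 0.7) -> str:
    """Map a smile-detector score in [0, 1] to a weather state.

    A score at or above the threshold is treated as a smile, so the
    virtual environment turns sunny; otherwise it stays rainy.
    """
    return "sunny" if smile_probability >= threshold else "rainy"


if __name__ == "__main__":
    # A confident smile brightens the scene; a neutral face keeps the rain.
    print(choose_weather(0.9))  # sunny
    print(choose_weather(0.3))  # rainy
```

In practice, per-frame scores are noisy, so a real implementation would likely smooth the score over several frames before switching states to avoid the weather flickering.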
On this project, I was the main developer for the image processing and local network sections.
Link to the project on the university's website