From 1999 to 2005, I served as a professor and researcher at Universita' dell'Immagine - la Scuola di Formazione della Fondazione Industria ONLUS, Milano. At UI, we had a Laboratory of Psychophysiology with a wide range of simulation and biofeedback equipment, including virtual reality goggles and brain, heart, skin and muscle sensors. Our research focused on three key areas: (1) emotional response to images, (2) operator performance in automobiles and airplanes, and (3) consumer response to products.

UI later incubated what would become Exmovere, the company I originally founded in 2003. Our first clients included the Big 3 in Detroit, and since then I have worked on a wide variety of sensor formats that can pick up symptoms of driver fatigue, anxiety, anger and other threats to safety. Most of the solutions I have proposed involve embedded ECG sensors, both contact and non-contact. I have also developed wearable solutions, as you can see above, and hope to launch a new product in this area later this year.

Credit and gratitude are in order to the founders and early partners of Exmovere and its various offshoots: Fabrizio Ferri, Tania Gianesin, Alina Lundry, Anna Barbara, Robert Doornick, Ronald H. Miller, Sunil Thakar, Hans-Joachim Ruff, William Giroldini, Phil Moinot, Kamran Fallahpour, Jeong-Hwan Kim, Paul Tulipana, Jon Kawa, Nicholas Senske, Blaise Boscaccy, Junfeng Chen and so many other colleagues who contributed their blood, sweat and tears to push forward these specific projects and automotive psychophysiology as a field. We haven't made it there yet, but the world has stopped dismissing these ideas outright.
Neuromarketing: Consumer Emotions Research
Gaming Biosensor Research
Between 1999 and 2005, before beginning the development of automotive biosensors, I worked with William Giroldini to develop video game controllers with embedded biosensors, so that we could observe players' vital signs and provide biofeedback.
The main barrier to neuromarketing between 1999 and 2005 was the lack of comfortable, non-clinical-looking sensors. Starting in 2005, I began working with wireless Brainquiry PET series GSR/EEG/ECG sensors. These enabled us to test subjects' emotional responses at concerts, political rallies, stores and other real-world experiences. The next problem became the synchronization of data and stimuli.
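The synchronization problem above can be illustrated with a minimal sketch. This is not the method we actually used; it simply shows one common approach: matching each timestamped stimulus event to the nearest sensor sample by binary search. All function and variable names here are hypothetical.

```python
from bisect import bisect_left

def align_to_stimuli(sample_times, samples, stimulus_times):
    """For each stimulus onset, find the nearest sensor sample.

    sample_times: sorted sensor timestamps in seconds
    samples: sensor readings, parallel to sample_times
    stimulus_times: timestamps at which stimuli were presented
    """
    aligned = []
    for t in stimulus_times:
        i = bisect_left(sample_times, t)
        # Choose the closer of the two neighboring samples.
        if i == 0:
            j = 0
        elif i == len(sample_times):
            j = len(sample_times) - 1
        else:
            j = i if sample_times[i] - t < t - sample_times[i - 1] else i - 1
        aligned.append((t, samples[j]))
    return aligned

# Hypothetical GSR stream at 4 Hz and two stimulus onsets.
times = [0.0, 0.25, 0.5, 0.75, 1.0]
gsr = [2.1, 2.2, 2.6, 2.5, 2.4]
print(align_to_stimuli(times, gsr, [0.3, 0.9]))  # [(0.3, 2.2), (0.9, 2.4)]
```

In practice the harder part was not the lookup itself but getting sensor clocks and stimulus logs onto a common timebase in the first place.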
The most interesting project that William Giroldini and I worked on was the creation of Mayafilm: PC software that blended biofeedback and video editing tools. The idea was to be able to scroll through test subjects' emotional responses on a timeline, the way one would in Avid or Final Cut. Mayafilm enabled us to correlate individual and group responses to various images more directly.
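The timeline idea can be sketched in a few lines. This is an illustrative reconstruction, not Mayafilm's actual implementation: given each subject's timestamped response stream, scrubbing to a timecode averages the samples near that point, first per subject and then across the group. All names are hypothetical.

```python
def group_response_at(timecode, subject_streams, window=0.5):
    """Mean group response within +/- window seconds of a timecode.

    subject_streams: dict mapping subject id -> list of
    (timestamp, value) pairs, e.g. a normalized arousal score.
    Returns None if no subject has samples in the window.
    """
    per_subject = []
    for stream in subject_streams.values():
        near = [v for t, v in stream if abs(t - timecode) <= window]
        if near:
            per_subject.append(sum(near) / len(near))  # per-subject mean
    return sum(per_subject) / len(per_subject) if per_subject else None

# Two hypothetical subjects watching the same footage.
streams = {
    "subject_1": [(10.0, 0.4), (10.4, 0.6)],
    "subject_2": [(10.1, 0.8)],
}
print(group_response_at(10.2, streams))  # 0.65
```

Scrubbing the playhead then just means re-evaluating this aggregate at each new timecode, which is what makes a video-editor-style timeline a natural interface for the data.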
Mayafilm in the Cockpit
Mayafilm in the Simulator
Mayafilm for Live Performances