The importance of nonverbal communication has been well established through several theories, including Albert Mehrabian's 7-38-55 rule, which proposes the respective importance of semantics (7%), tonality (38%), and facial expressions (55%) in communication. Although several studies have examined how emotions are expressed and perceived in communication, there is limited research investigating the relationship between how emotions are expressed through semantics and through facial expressions. Using facial expression analysis software to deconstruct facial expressions into features and a K-Nearest-Neighbor (KNN) machine learning classifier, we explored whether facial expressions can be clustered based on semantics. Our findings indicate that facial expressions can be clustered based on semantics and that there is an inherent congruence between facial expressions and semantics. These results are novel and significant in the context of nonverbal communication and are applicable to several areas of research, including the vast field of emotion AI and machine emotional communication.
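A minimal sketch of the kind of analysis described above, assuming facial-expression feature vectors (e.g., action-unit intensities from an analysis tool) labeled by semantic category; the feature dimensions, category names, and synthetic data are hypothetical stand-ins, not the authors' dataset or pipeline.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical setup: 3 semantic categories, each producing facial-expression
# feature vectors scattered around a distinct centroid.
n_per_class, n_features = 100, 12
centroids = rng.normal(0, 2.0, size=(3, n_features))
X = np.vstack([c + rng.normal(0, 1.0, size=(n_per_class, n_features)) for c in centroids])
y = np.repeat(["positive", "negative", "neutral"], n_per_class)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y
)

# Standardize features before distance-based classification, then fit KNN.
knn = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
knn.fit(X_train, y_train)
print(f"Held-out accuracy: {knn.score(X_test, y_test):.2f}")
```

High held-out accuracy in this setting would indicate that the feature vectors form semantically coherent clusters, which is the kind of congruence the study reports.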
Recordings from both layers of the flexible μECoG array showed frequency features typical of cortical local field potentials (LFPs) and were stable in amplitude over time. Recordings from both layers also showed consistent, frequency-dependent modulation after induction of general anesthesia, with large increases in the beta and gamma bands and decreases in the theta band observed across three experiments. Recordings from conventional μECoG arrays over human cortex showed robust modulation in a high-frequency (250-2000 Hz) band upon production of spoken words. Modulation in this band was used to predict spoken words with over 90% accuracy. Basal ganglia neuronal action potential (AP) firing was also shown to correlate significantly with various cortical μECoG recordings in this frequency band. These results indicate that μECoG surface electrodes may detect high-frequency neuronal activity potentially associated with AP firing, a source of information previously unutilized by these devices.
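A minimal sketch, not the authors' analysis, of how 250-2000 Hz band power might be extracted from μECoG-like signals and used as a feature for word classification; the sampling rate, trial structure, classifier, and synthetic data are assumptions for illustration only.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

fs = 6000  # Hz; assumed sampling rate, chosen to exceed twice the 2000 Hz band edge
sos = butter(4, [250, 2000], btype="bandpass", fs=fs, output="sos")

def band_power(trial):
    """Mean analytic-amplitude power in the 250-2000 Hz band for one trial (channels x samples)."""
    filtered = sosfiltfilt(sos, trial, axis=-1)
    envelope = np.abs(hilbert(filtered, axis=-1))
    return (envelope ** 2).mean(axis=-1)

# Synthetic trials: two "words", 40 trials each, 16 channels, 0.5 s per trial.
rng = np.random.default_rng(1)
n_trials, n_channels, n_samples = 40, 16, fs // 2

def make_trials(gain):
    # Broadband noise whose amplitude (gain) mimics word-dependent high-frequency modulation.
    return gain * rng.normal(0, 1.0, size=(n_trials, n_channels, n_samples))

X = np.vstack([make_trials(1.0), make_trials(1.3)])
y = np.array([0] * n_trials + [1] * n_trials)
features = np.stack([band_power(trial) for trial in X])

scores = cross_val_score(LinearDiscriminantAnalysis(), features, y, cv=5)
print(f"Cross-validated word-classification accuracy: {scores.mean():.2f}")
```

The design choice illustrated here is that per-channel high-frequency band power, rather than the raw waveform, serves as the feature on which a simple linear classifier can discriminate spoken words.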