The importance of nonverbal communication has been well established through several theories, including Albert Mehrabian's 7-38-55 rule, which proposes the relative importance of semantics, tonality, and facial expressions in communication. Although several studies have examined how emotions are expressed and perceived in communication, there is limited research investigating the relationship between how emotions are expressed through semantics and through facial expressions. Using facial expression analysis software to deconstruct facial expressions into features, together with a K-Nearest-Neighbors (KNN) machine learning classifier, we explored whether facial expressions can be clustered based on semantics. Our findings indicate that facial expressions can be clustered by semantics and that there is an inherent congruence between facial expressions and semantics. These results are novel and significant in the context of nonverbal communication and are applicable to several areas of research, including the vast field of emotion AI and machine emotional communication.
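The KNN approach described above can be sketched minimally as follows. The feature vectors, labels, and choice of k here are illustrative assumptions, not the study's actual facial-expression features or semantic categories:

```python
import math
from collections import Counter

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training
    points (Euclidean distance in feature space).

    `train` is a list of (feature_vector, label) pairs; the vectors
    stand in for facial-expression features and the labels for
    semantic categories (toy values, not real data)."""
    dists = sorted((math.dist(vec, query), label) for vec, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]

# Two hypothetical "semantic" clusters in a 2-D feature space
train = [
    ((0.9, 0.1), "positive"), ((0.8, 0.2), "positive"),
    ((0.7, 0.3), "positive"),
    ((0.1, 0.9), "negative"), ((0.2, 0.8), "negative"),
    ((0.3, 0.7), "negative"),
]
print(knn_predict(train, (0.85, 0.15)))  # -> positive
```

Congruence between modalities would then show up as queries built from one modality's features landing consistently in the matching semantic cluster.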
Recordings from both layers of the flexible μECoG array showed frequency features typical of cortical local field potentials (LFPs) and were stable in amplitude over time. Recordings from both layers also showed consistent, frequency-dependent modulation after induction of general anesthesia, with large increases in the beta and gamma bands and decreases in the theta band observed over three experiments. Recordings from conventional μECoG arrays over human cortex showed robust modulation in a high-frequency (250-2000 Hz) band upon production of spoken words. Modulation in this band was used to predict spoken words with over 90% accuracy. Basal ganglia neuronal action potential (AP) firing was also shown to correlate significantly with cortical μECoG recordings in this frequency band. These results indicate that μECoG surface electrodes may detect high-frequency neuronal activity potentially associated with AP firing, a source of information previously unutilized by these devices.
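The word-decoding feature described above rests on measuring signal power within a high-frequency band. A minimal sketch of band-power extraction, using a plain DFT on synthetic tones; the sampling rate, signal length, and test frequencies are illustrative, and real μECoG pipelines would use windowed FFTs rather than this toy transform:

```python
import cmath
import math

def band_power(signal, fs, f_lo, f_hi):
    """Mean spectral power of `signal` (sampled at `fs` Hz) within
    [f_lo, f_hi] Hz, computed bin by bin with a plain DFT.
    Illustrates the band-selection idea behind a 250-2000 Hz feature."""
    n = len(signal)
    powers = []
    for k in range(n // 2 + 1):
        freq = k * fs / n
        if f_lo <= freq <= f_hi:
            coef = sum(signal[t] * cmath.exp(-2j * math.pi * k * t / n)
                       for t in range(n))
            powers.append(abs(coef) ** 2 / n)
    return sum(powers) / len(powers) if powers else 0.0

# Synthetic check: a 500 Hz tone should carry far more power in the
# 250-2000 Hz band than a 20 Hz tone (fs = 5000 Hz, 0.1 s of data).
fs, n = 5000, 500
hf = [math.sin(2 * math.pi * 500 * t / fs) for t in range(n)]
lf = [math.sin(2 * math.pi * 20 * t / fs) for t in range(n)]
print(band_power(hf, fs, 250, 2000) > band_power(lf, fs, 250, 2000))
```

Per-channel band-power values like these, computed over short time windows, are the kind of feature a word classifier could be trained on.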
The effect of neuromodulation on proprioceptive sensitivity was assessed using transcutaneous electrical nerve stimulation (TENS), which has been shown to have beneficial effects on human cognitive and sensorimotor performance in other contexts. In this pilot study, the effects of two stimulation frequencies (30 Hz and 300 Hz) and three electrode configurations were examined. No effect of electrode configuration was found; however, sensitivity with 30 Hz stimulation was significantly lower than with 300 Hz stimulation (which was similar to sensitivity without stimulation). Although TENS was shown to modulate proprioceptive sensitivity, additional experiments are required to determine whether TENS can enhance rather than depress sensitivity, which would have positive implications for rehabilitation of proprioceptive deficits arising from stroke and other disorders.