Human activity recognition is the task of identifying a person’s movement from sensors in a wearable device such as a smartphone, smartwatch, or medical-grade monitor. Machine learning, the study of algorithms that improve automatically from data, is well suited to this task: classification models can accurately identify activities from the time-series data produced by accelerometers and gyroscopes. One significant way to improve the accuracy of these models is to preprocess the data, transforming it so that each activity, or class, is easier for the model to distinguish.

This paper first describes the design of SigNorm, a new web application that lets users conveniently transform time-series data and view the effects of those transformations in a code-free, browser-based user interface. The second and final section applies these ideas to a human activity recognition problem, comparing a preprocessed dataset to an unaugmented one and measuring the difference in classification accuracy using a one-dimensional convolutional neural network.
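The abstract does not specify which transformations SigNorm applies, but z-score normalization and fixed-length windowing are both common preprocessing steps for accelerometer data before classification. A minimal sketch, assuming those two steps:

```python
import statistics


def zscore(signal):
    """Standardize a 1-D signal to zero mean and unit variance.

    A common preprocessing step for accelerometer/gyroscope channels
    before the windows are fed to a classifier.
    """
    mean = statistics.fmean(signal)
    std = statistics.pstdev(signal)
    if std == 0:
        return [0.0 for _ in signal]
    return [(x - mean) / std for x in signal]


def sliding_windows(signal, size, step):
    """Segment a time series into fixed-length, possibly overlapping windows."""
    return [signal[i:i + size] for i in range(0, len(signal) - size + 1, step)]


# Example: a short accelerometer trace split into 4-sample windows
# with 50% overlap, each window standardized independently.
trace = [0.1, 0.4, 0.2, 0.9, 1.1, 0.8, 0.3, 0.2]
windows = [zscore(w) for w in sliding_windows(trace, size=4, step=2)]
```

Each normalized window would then be a single training example for the one-dimensional convolutional neural network; the window length and overlap here are illustrative, not the study's actual parameters.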
As life expectancy increases worldwide, age-related diseases are becoming greater health concerns. One of the most prevalent age-related diseases in the United States is dementia, with Alzheimer’s disease (AD) being the most common form, accounting for 60–80% of cases. Genetics plays a large role in a person’s risk of developing AD. Familial AD, which makes up less than 1% of all AD cases, is caused by autosomal dominant gene mutations and has almost 100% penetrance. Genetic risk factors are believed to account for about 49–79% of the risk in sporadic cases. Many genetic risk factors for both familial and sporadic AD have been identified, but much work remains in the field, especially in non-Caucasian populations. This review summarizes the three major genes responsible for familial AD, namely APP, PSEN1, and PSEN2. Also discussed are seven identified genetic risk factors for sporadic AD: single nucleotide polymorphisms in the APOE, ABCA7, NEDD9, CASS4, PTK2B, CLU, and PICALM genes. An overview of the main function of the proteins encoded by these genes is given, along with their proposed connection to AD pathology.
The purpose of this study is to determine the feasibility of three widely used wearable sensors in research settings for 24 h monitoring of sleep, sedentary, and active behaviors in middle-aged women.
Methods
Participants were 21 inactive, overweight women (mean Body Mass Index (BMI) = 29.27 ± 7.43) aged 30 to 64 years (M = 45.31 ± 9.67). Women were instructed to wear each sensor on the non-dominant hip (ActiGraph GT3X+), wrist (GENEActiv), or upper arm (BodyMedia SenseWear Mini) for 24 h/day and to record daily wake and bed times, wearing each sensor for one week over three consecutive weeks. Women received feedback about their daily physical activity and sleep behaviors. Feasibility (i.e., acceptability and demand) was measured using surveys, interviews, and wear time.
Results
Women felt the GENEActiv (94.7%) and SenseWear Mini (90.0%) were easier to wear and preferred their placement (68.4% and 80.0%, respectively) compared to the ActiGraph (42.9% and 47.6%, respectively). Mean wear time on valid days was similar across sensors (ActiGraph: M = 918.8 ± 115.0 min; GENEActiv: M = 949.3 ± 86.6; SenseWear: M = 928.0 ± 101.8) and well above that reported by studies using wake-time-only protocols. Informational feedback was the biggest motivator, while appearance, comfort, and inconvenience were the biggest barriers to wearing the sensors. Wear time was valid on 93.9% (ActiGraph), 100% (GENEActiv), and 95.2% (SenseWear) of eligible days. Seven valid days of data were obtained for 61.9%, 95.2%, and 71.4% of participants for the ActiGraph, GENEActiv, and SenseWear, respectively.
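The abstract does not state the wear-time threshold that defined a valid day, but feasibility percentages like those above are typically computed by counting days that meet a minimum daily wear time. A minimal sketch, assuming a hypothetical 10-hour (600-minute) criterion:

```python
def percent_valid_days(daily_wear_minutes, min_wear_minutes=600):
    """Percentage of monitored days meeting a minimum wear-time threshold.

    The 600-minute (10 h) default is a hypothetical example; the study's
    actual validity criterion is not given in the abstract.
    """
    valid = sum(1 for m in daily_wear_minutes if m >= min_wear_minutes)
    return 100.0 * valid / len(daily_wear_minutes)


# Example: one (made-up) participant's wear time in minutes/day over a
# 7-day monitoring week; only the 540-minute day falls below threshold.
week = [918, 1025, 540, 960, 890, 1010, 975]
pct = percent_valid_days(week)  # 6 of 7 days valid
```

The same counting logic, applied across participants and sensors, would yield the per-sensor valid-day percentages reported in the results.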
Conclusion
Twenty-four-hour monitoring over seven consecutive days is a feasible approach in middle-aged women. Researchers should consider participant acceptability and demand, in addition to validity and reliability, when choosing a wearable sensor. More research is needed across populations and study designs.