This research paper assesses the effectiveness of a remote garden-based learning curriculum in teaching elementary students basic systems thinking concepts. Five remote lessons were designed, each covering a different garden topic, and the Systems Thinking Hierarchical Model was used to integrate systems thinking concepts. This model comprises eight emergent characteristics of systems thinking necessary for developing systems thinking competency. Five students completed the remote garden-based learning lessons. Student work was evaluated for systems thinking understanding, and student outcomes were compared to anticipated learning outcomes. Results suggest that elementary students are able to understand basic systems thinking concepts: student work met anticipated outcomes for four systems thinking characteristics and exceeded anticipated outcomes for one. These results are significant because they further confirm that elementary-aged students have the ability to understand systems thinking, and they contribute to a growing movement to integrate sustainability education into elementary curricula.
In this synthesis, we hope to accomplish two things: 1) reflect on how the analysis of the new archaeological cases presented in this special feature adds to previous case studies by revisiting a set of propositions reported in a 2006 special feature, and 2) reflect on four main ideas that are more specific to the archaeological cases: i) societal choices are influenced by robustness–vulnerability trade-offs, ii) there is interplay between robustness–vulnerability trade-offs and robustness–performance trade-offs, iii) societies often get locked in to particular strategies, and iv) multiple positive feedbacks escalate the perceived cost of societal change. We then discuss whether these lock-in traps can be prevented or whether the risks associated with them can be mitigated. We conclude by highlighting how these long-term historical studies can help us to understand current society, societal practices, and the nexus between ecology and society.
Human activity recognition is the task of identifying a person’s movement from sensors in a wearable device, such as a smartphone, smartwatch, or medical-grade device. Machine learning is well suited to this task: classification models trained on time-series data from accelerometers and gyroscopes can identify activities with high accuracy. A significant way to improve the accuracy of these models is preprocessing, that is, transforming the raw data so that each activity, or class, is easier for the model to distinguish.

On this topic, this paper first explains the design of SigNorm, a new web application that lets users conveniently transform time-series data and view the effects of those transformations in a code-free, browser-based user interface. The second and final section presents my approach to a human activity recognition problem: comparing a preprocessed dataset against an unaugmented one, and measuring the resulting difference in accuracy using a one-dimensional convolutional neural network to make classifications.
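As a hedged illustration of the kind of preprocessing described above, the sketch below segments a multi-channel accelerometer trace into overlapping windows and z-score normalizes each window per channel. The window size, step, and choice of z-score normalization are illustrative assumptions, not details taken from SigNorm or the paper's dataset.

```python
import numpy as np

def segment_windows(signal, window_size, step):
    """Split a (time, channels) signal into overlapping fixed-length windows."""
    windows = []
    for start in range(0, len(signal) - window_size + 1, step):
        windows.append(signal[start:start + window_size])
    return np.stack(windows)  # shape: (num_windows, window_size, channels)

def zscore_normalize(windows):
    """Normalize each window, per channel, to zero mean and unit variance."""
    mean = windows.mean(axis=1, keepdims=True)
    std = windows.std(axis=1, keepdims=True)
    return (windows - mean) / (std + 1e-8)  # epsilon guards against flat channels

# Illustrative input: a synthetic 3-axis accelerometer trace, 500 samples
rng = np.random.default_rng(0)
trace = rng.normal(size=(500, 3))

windows = segment_windows(trace, window_size=128, step=64)
normalized = zscore_normalize(windows)
```

Windows of this shape — `(num_windows, window_size, channels)` — are the standard input layout for a one-dimensional convolutional classifier, where the convolution slides along the time axis and treats the sensor axes as channels.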