Matching Items (90)

148033-Thumbnail Image.png
Description

Every communication system has a transmitter and a receiver, irrespective of whether it is wired or wireless. The future of wireless communication consists of a massive number of transmitters and receivers. The question arises: can we use computer vision to help wireless communication? To satisfy the high data-rate requirement, a large number of antennas are required. The devices that employ large antenna arrays often carry other sensors, such as RGB cameras, depth cameras, or LiDAR sensors. These vision sensors can help overcome non-trivial wireless communication challenges, such as beam blockage prediction and handover prediction. This is further motivated by the recent advances in deep learning and computer vision that can extract high-level semantics from complex visual scenes, and by the increasing interest in leveraging machine/deep learning tools for wireless communication problems. [1]

The research focused on technologies such as 3D cameras, object detection, and object tracking using computer vision, together with compression techniques. The main objective of using computer vision was to make millimeter-wave communication more robust and to collect more data for the machine learning algorithms. Pre-built lossless and lossy compression tools, such as FFmpeg, were used in the research. An algorithm was developed that uses 3D cameras and machine learning models such as YOLOv3 to track moving objects with servo motors and low-powered computers like the Raspberry Pi or the Jetson Nano. In other words, the receiver could track the highly mobile transmitter in one dimension using a 3D camera. Furthermore, during the research the transmitter was mounted on a DJI M600 Pro drone, and machine learning and object tracking were used to track the highly mobile drone. In order to build this machine learning model and object tracker, collecting data such as depth, RGB images, and position coordinates was the first and most important step. GPS coordinates from the DJI M600 were also pulled and successfully plotted on Google Earth. This proved very useful during data collection with the drone and for future applications of drone position estimation using machine learning.

Initially, images were taken from the transmitter camera every second, and those frames were converted to a text file containing hexadecimal values. Each text file was then transmitted from the transmitter to the receiver, and on the receiver side a Python script converted the hexadecimal back to a JPG. This gave the effect of real-time video transmission. Towards the end of the research, however, industry-standard real-time video was streamed using pre-built FFmpeg modules, GNU Radio, and a Universal Software Radio Peripheral (USRP). The transmitter camera was a Pi Camera. More details are discussed as we dive deeper into this research report.
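
The hexadecimal frame-transfer step described above can be illustrated with a minimal Python sketch. The file names are placeholders, and the actual radio link over GNU Radio/USRP is outside the scope of this snippet:

```python
from pathlib import Path

def frame_to_hex(jpg_path: str, txt_path: str) -> None:
    """Transmitter side: dump a captured JPEG frame as hexadecimal text."""
    data = Path(jpg_path).read_bytes()
    Path(txt_path).write_text(data.hex())

def hex_to_frame(txt_path: str, jpg_path: str) -> None:
    """Receiver side: rebuild the JPEG frame from the received hex text."""
    hex_text = Path(txt_path).read_text().strip()
    Path(jpg_path).write_bytes(bytes.fromhex(hex_text))

# Example round trip (file names are placeholders):
# frame_to_hex("frame_0001.jpg", "frame_0001.txt")     # before sending over the link
# hex_to_frame("frame_0001.txt", "rx_frame_0001.jpg")  # after reception
```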

Contributors: Seth, Madhav (Author) / Alkhateeb, Ahmed (Thesis director) / Alrabeiah, Muhammad (Committee member) / Electrical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2021-05
150231-Thumbnail Image.png
Description
In this thesis I introduce a new direction in computing based on nonlinear chaotic dynamics. The main idea is that the rich dynamics of a chaotic system enable us to (1) build better computers that have a flexible instruction set, and (2) carry out computations that conventional computers are not good at. Here I start from the theory, explaining how one can build a computing logic block using a chaotic system, and then I introduce a new theoretical analysis for chaos computing. Specifically, I demonstrate how unstable periodic orbits, and a model based on them, explain and predict how, and how well, a chaotic system can perform computation. Furthermore, since unstable periodic orbits and their stability measures in terms of eigenvalues are extractable from experimental time series, I develop a time-series technique for modeling and predicting chaos computing from a given time series of a chaotic system. After building a theoretical framework for chaos computing, I proceed to the architecture of these chaos-computing blocks and to building a sophisticated computing system out of them. I describe how one can arrange and organize these chaos-based blocks to build a computer, and I propose a brand-new computer architecture using chaos computing, which shifts the limits of conventional computers by introducing a flexible instruction set: the user can load a desired instruction set to reconfigure the computer into an implementation of that instruction set. Apart from the direct application of chaos theory to generic computation, the application of chaos theory to speech processing is explained, and a novel application of chaos theory to speech coding and synthesis is introduced. More specifically, it is demonstrated how a chaotic system can model the natural turbulent flow of air in the human speech production system and how chaotic orbits can be used to excite a vocal tract model. Finally, as another approach to building computing systems based on nonlinear systems, the idea of logical stochastic resonance is studied and adapted to an autoregulatory gene network in the bacteriophage λ.
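
To make the reconfigurability idea concrete, here is a small, purely illustrative Python sketch (not the thesis's actual construction): a single logistic-map element is turned into different logic gates simply by choosing a different bias, perturbation size, and output threshold, which is what a flexible instruction set exploits. All parameter ranges below are assumptions for the toy search.

```python
import numpy as np

def logistic(x, a=4.0):
    """One iterate of the fully chaotic logistic map."""
    return a * x * (1.0 - x)

def gate_outputs(x_star, delta, threshold):
    """Evaluate the chaotic element for the four 2-bit inputs (00, 01, 10, 11).
    Input bits are encoded as additive perturbations of size `delta` on the
    initial state `x_star`; the output bit is 1 when the iterated state
    exceeds `threshold`."""
    outs = []
    for b1 in (0, 1):
        for b2 in (0, 1):
            outs.append(int(logistic(x_star + delta * (b1 + b2)) > threshold))
    return tuple(outs)

def find_gate(truth_table):
    """Brute-force search for (x_star, delta, threshold) realizing a truth table.
    The same chaotic element implements different gates under different control
    parameters -- the essence of a reconfigurable, 'flexible' logic block."""
    for x_star in np.linspace(0.0, 0.5, 120):
        for delta in np.linspace(0.01, 0.25, 30):
            if x_star + 2 * delta >= 1.0:
                continue
            for threshold in np.linspace(0.05, 0.95, 40):
                if gate_outputs(x_star, delta, threshold) == truth_table:
                    return round(x_star, 3), round(delta, 3), round(threshold, 3)
    return None

print("NOR parameters:", find_gate((1, 0, 0, 0)))  # output 1 only for input 00
print("AND parameters:", find_gate((0, 0, 0, 1)))  # output 1 only for input 11
```
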
Contributors: Kia, Behnam (Author) / Ditto, William (Thesis advisor) / Huang, Liang (Committee member) / Lai, Ying-Cheng (Committee member) / Helms Tillery, Stephen (Committee member) / Arizona State University (Publisher)
Created: 2011
150551-Thumbnail Image.png
Description
Complex dynamical systems consisting of interacting dynamical units are ubiquitous in nature and society. Predicting and reconstructing the nonlinear dynamics of the units, and the complex interaction networks among them, serves as the basis for understanding a variety of collective dynamical phenomena. I present a general method to address these two outstanding problems as a whole based solely on time-series measurements. The method is implemented by incorporating a compressive-sensing approach that enables accurate reconstruction of complex dynamical systems in terms of both the nodal equations that determine the self-dynamics of the units and the detailed coupling patterns among the units. The representative advantages of the approach are (i) the sparse data requirement, which allows a successful reconstruction from limited measurements, and (ii) general applicability to identical and nonidentical nodal dynamics and to networks with arbitrary interaction structure, strength, and size. Two other challenging problems of significant interest in nonlinear dynamics, (i) predicting catastrophes in nonlinear dynamical systems in advance of their occurrence and (ii) predicting the future state of time-varying nonlinear dynamical systems, can also be formulated and solved in the compressive-sensing framework using only limited measurements. Once the network structure has been inferred, the dynamical behavior on the network can be investigated, for example to optimize information-spreading dynamics, suppress cascading dynamics and traffic congestion, enhance synchronization, or study game dynamics. The results can yield insights into the design of control strategies for real-world social and natural systems. Since 2004 there has been a tremendous amount of interest in graphene. The most remarkable feature of graphene is the linear energy-momentum relationship at low energies. The quasi-particles inside the system can be treated as chiral, massless Dirac fermions obeying relativistic quantum mechanics. Therefore, graphene provides a perfect test bed for investigating relativistic quantum phenomena, such as relativistic quantum chaotic scattering and abnormal electron paths induced by Klein tunneling. These phenomena have profound implications for the development of graphene-based devices that require stable electronic properties.
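
The reconstruction idea can be illustrated with a small, hypothetical Python example: a toy network of coupled logistic maps with a sparse coupling matrix is simulated, and one row of the coupling matrix is recovered from a short time series by sparse (L1-penalized) regression. The coupling form and all parameters are assumptions made for the sketch, not the dissertation's actual model.

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Toy network: 5 coupled logistic maps with a sparse coupling matrix W (assumed form).
n, T, r = 5, 60, 3.9
W = np.zeros((n, n))
W[0, 2], W[1, 4], W[3, 0] = 0.15, 0.20, 0.10       # a few sparse links

x = np.empty((T + 1, n))
x[0] = rng.uniform(0.2, 0.8, n)
for t in range(T):
    local = r * x[t] * (1 - x[t])                  # self-dynamics of each node
    x[t + 1] = (1 - W.sum(axis=1)) * local + W @ x[t]

# Reconstruct row i of W from the short time series: the increment of node i
# beyond its local map is a linear combination of (x_j - f(x_i)) terms, and an
# L1 penalty (sparsity) picks out the few true links from limited data.
i = 0
f_i = r * x[:-1, i] * (1 - x[:-1, i])
library = np.column_stack([x[:-1, j] - f_i for j in range(n)])
target = x[1:, i] - f_i
fit = Lasso(alpha=1e-4, fit_intercept=False, max_iter=50000).fit(library, target)
print("true couplings of node 0:", W[i])
print("recovered couplings     :", np.round(fit.coef_, 3))
```
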
Contributors: Yang, Rui (Author) / Lai, Ying-Cheng (Thesis advisor) / Duman, Tolga M. (Committee member) / Akis, Richard (Committee member) / Huang, Liang (Committee member) / Arizona State University (Publisher)
Created: 2012
151230-Thumbnail Image.png
Description
What classical chaos can do to quantum systems is a fundamental issue highly relevant to a number of branches of physics. The field of quantum chaos has been active for three decades, with the focus on non-relativistic quantum systems described by the Schrödinger equation. By developing an efficient method to solve the Dirac equation in the setting where relativistic particles can tunnel between two symmetric cavities through a potential barrier, chaotic cavities are found to suppress the spread in the tunneling rate. For integrable classical dynamics, the tunneling rate at any given energy assumes a wide range that increases with the energy; for chaotic underlying dynamics, the spread is greatly reduced. A remarkable feature, a consequence of Klein tunneling that arises only in relativistic quantum systems, is that substantial tunneling exists even as the particle energy approaches zero. Similar results are found in graphene tunneling devices, implying the high relevance of relativistic quantum chaos to the development of such devices. Wave propagation through random media occurs in many physical systems, where interesting phenomena such as branched, fractal-like wave patterns can arise. The generic origin of these wave structures is currently a matter of active debate. It is of fundamental interest to develop a minimal, paradigmatic model that can generate robust branched wave structures. A general observation in all situations where branched structures emerge is non-Gaussian statistics of the wave intensity with an algebraic tail in the probability density function; thus, a universal algebraic wave-intensity distribution becomes the criterion for the validity of any minimal model of branched wave patterns. Coexistence of competing species in spatially extended ecosystems is key to biodiversity in nature. Understanding the dynamical mechanisms of coexistence is a fundamental problem of continuous interest, not only in evolutionary biology but also in nonlinear science. A continuous model is proposed for cyclically competing species, and the effect of the interplay between interaction range and mobility on coexistence is investigated. A transition from coexistence to extinction is uncovered, with a non-monotonic behavior in the coexistence probability, and switches between spiral and plane-wave patterns arise. Strong mobility can either promote or hamper coexistence, a feature absent in lattice-based models, and can be explained in terms of nonlinear partial differential equations.
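
The algebraic-tail criterion mentioned above is straightforward to test numerically. The following Python sketch uses synthetic stand-in data rather than simulated wave fields: it compares a Hill-type maximum-likelihood tail-exponent estimate for an exponential (speckle-like) intensity distribution against a genuinely heavy-tailed one; the cutoff and sample sizes are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-ins for normalized wave intensities: an exponential (speckle-like)
# distribution versus a heavy-tailed one with P(I) ~ I^(-3) mimicking branched waves.
speckle_like = rng.exponential(scale=1.0, size=200_000)
heavy_tailed = rng.pareto(a=2.0, size=200_000) + 1.0

def tail_exponent(intensity, i_min=5.0):
    """Hill-type maximum-likelihood estimate of alpha in P(I) ~ I^(-alpha), I > i_min."""
    tail = intensity[intensity > i_min]
    return 1.0 + tail.size / np.sum(np.log(tail / i_min))

# The heavy-tailed sample returns a stable exponent near 3; the exponential sample
# gives a much larger, cutoff-dependent value, i.e., no genuine algebraic tail.
print("speckle-like field :", round(tail_exponent(speckle_like), 2))
print("heavy-tailed field :", round(tail_exponent(heavy_tailed), 2))
```
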
Contributors: Ni, Xuan (Author) / Lai, Ying-Cheng (Thesis advisor) / Huang, Liang (Committee member) / Yu, Hongbin (Committee member) / Akis, Richard (Committee member) / Arizona State University (Publisher)
Created: 2012
151177-Thumbnail Image.png
Description
Single cell analysis has become increasingly important in understanding disease onset, progression, treatment and prognosis, especially when applied to cancer where cellular responses are highly heterogeneous. Through the advent of single cell computerized tomography (Cell-CT), researchers and clinicians now have the ability to obtain high resolution three-dimensional (3D) reconstructions of single cells. Yet to date, no live-cell compatible version of the technology exists. In this thesis, a microfluidic chip with the ability to rotate live single cells in hydrodynamic microvortices about an axis parallel to the optical focal plane has been demonstrated. The chip utilizes a novel 3D microchamber design arranged beneath a main channel creating flow detachment into the chamber, producing recirculating flow conditions. Single cells are flowed through the main channel, held in the center of the microvortex by an optical trap, and rotated by the forces induced by the recirculating fluid flow. Computational fluid dynamics (CFD) was employed to optimize the geometry of the microchamber. Two methods for the fabrication of the 3D microchamber were devised: anisotropic etching of silicon and backside diffuser photolithography (BDPL). First, the optimization of the silicon etching conditions was demonstrated through design of experiment (DOE). In addition, a non-conventional method of soft-lithography was demonstrated which incorporates the use of two positive molds, one of the main channel and the other of the microchambers, compressed together during replication to produce a single ultra-thin (<200 µm) negative used for device assembly. Second, methods for using thick negative photoresists such as SU-8 with BDPL have been developed which include a new simple and effective method for promoting the adhesion of SU-8 to glass. An assembly method that bonds two individual ultra-thin (<100 µm) replications of the channel and the microfeatures has also been demonstrated. Finally, a pressure driven pumping system with nanoliter per minute flow rate regulation, sub-second response times, and < 3% flow variability has been designed and characterized. The fabrication and assembly of this device is inexpensive and utilizes simple variants of conventional microfluidic fabrication techniques, making it easily accessible to the single cell analysis community.
Contributors: Myers, Jakrey R (Author) / Meldrum, Deirdre (Thesis advisor) / Johnson, Roger (Committee member) / Frakes, David (Committee member) / Arizona State University (Publisher)
Created: 2012
130342-Thumbnail Image.png
Description
Background
Grading schemes for breast cancer diagnosis are predominantly based on pathologists' qualitative assessment of altered nuclear structure from 2D brightfield microscopy images. However, cells are three-dimensional (3D) objects with features that are inherently 3D and thus poorly characterized in 2D. Our goal is to quantitatively characterize nuclear structure in 3D, assess its variation with malignancy, and investigate whether such variation correlates with standard nuclear grading criteria.
Methodology
We applied micro-optical computed tomographic imaging and automated 3D nuclear morphometry to quantify and compare morphological variations between human cell lines derived from normal, benign fibrocystic or malignant breast epithelium. To reproduce the appearance and contrast in clinical cytopathology images, we stained cells with hematoxylin and eosin and obtained 3D images of 150 individual stained cells of each cell type at sub-micron, isotropic resolution. Applying volumetric image analyses, we computed 42 3D morphological and textural descriptors of cellular and nuclear structure.
Principal Findings
We observed four distinct nuclear shape categories, the predominant being a mushroom cap shape. Cell and nuclear volumes increased from normal to fibrocystic to metastatic type, but there was little difference in the volume ratio of nucleus to cytoplasm (N/C ratio) between the lines. Abnormal cell nuclei had more nucleoli, markedly higher density and clumpier chromatin organization compared to normal. Nuclei of non-tumorigenic, fibrocystic cells exhibited larger textural variations than metastatic cell nuclei. At p<0.0025 by ANOVA and Kruskal-Wallis tests, 90% of our computed descriptors statistically differentiated control from abnormal cell populations, but only 69% of these features statistically differentiated the fibrocystic from the metastatic cell populations.
Conclusions
Our results provide a new perspective on nuclear structure variations associated with malignancy and point to the value of automated quantitative 3D nuclear morphometry as an objective tool to enable development of sensitive and specific nuclear grade classification in breast cancer diagnosis.
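
The descriptor-screening statistics reported under Principal Findings can be sketched in Python as follows. The descriptor values here are synthetic placeholders (the study's real inputs are the 42 computed 3D morphometric features per cell), and only the Kruskal-Wallis branch of the testing at the p < 0.0025 threshold is shown.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical descriptor matrices: rows = individual cells (about 150 per line in the
# study), columns = the 42 computed 3D morphological/textural descriptors. The values
# below are synthetic placeholders, not the measured features.
normal      = rng.normal(0.0, 1.0, size=(150, 42))
fibrocystic = rng.normal(0.3, 1.1, size=(150, 42))
metastatic  = rng.normal(0.6, 1.2, size=(150, 42))

ALPHA = 0.0025   # significance threshold quoted in the abstract

def fraction_significant(*groups):
    """Fraction of descriptors whose Kruskal-Wallis test separates the groups at ALPHA."""
    n_desc = groups[0].shape[1]
    hits = sum(stats.kruskal(*[g[:, k] for g in groups]).pvalue < ALPHA
               for k in range(n_desc))
    return hits / n_desc

abnormal = np.vstack([fibrocystic, metastatic])
print("control vs abnormal      :", fraction_significant(normal, abnormal))
print("fibrocystic vs metastatic:", fraction_significant(fibrocystic, metastatic))
```
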
Created: 2012-01-05
132010-Thumbnail Image.png
Description
Complex human control is a topic of much interest in the fields of robotics, manufacturing, space exploration, and many others. Even simple tasks that humans perform with ease can be extremely complicated when observed from a controls and complex-systems perspective. One such simple task is that of a human carrying and moving a coffee cup. Though this may be a mundane task for humans, when it is modeled and analyzed, the system can be quite chaotic in nature. Understanding such systems is key to the development of robots and autonomous systems that can perform these tasks themselves.

The coffee cup system can be simplified and modeled by a cart-and-pendulum system. Bazzi et al. and Maurice et al. present two different cart-and-pendulum systems to represent the coffee cup system [1],[2]. The purpose of this project was to build upon these systems, gain a better understanding of the coffee cup system, and determine where chaos exists within the system. The honors thesis team first worked with their senior design group to develop a mathematical model for the cart-and-pendulum system based on the Bazzi and Maurice papers [1],[2]. The honors thesis team then analyzed and extended this system into a cart-and-two-pendulum model that represents the coffee cup system more accurately.

Analysis of the single-pendulum model showed that there exists a low-frequency region where the pendulum and the cart remain in phase with each other and a high-frequency region where the cart and pendulum have a π phase difference between them. The transition point between the low- and high-frequency regions is determined by the resonant frequency of the pendulum. The analysis of the two-pendulum system confirmed this result and revealed that differences in length between the pendulums cause them to transition to the high-frequency region at separate frequencies: each pendulum has its own resonant frequency and transitions into the high-frequency region accordingly. This creates a range of frequencies where the pendulums are out of phase with each other. After both pendulums have transitioned, they remain in phase with each other and out of phase with the cart.
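
The in-phase/anti-phase transition around the pendulum's resonant frequency can be reproduced with a short numerical experiment. The Python sketch below simulates a single damped pendulum whose pivot is driven sinusoidally (all parameter values are assumptions for illustration, not the team's actual cart model) and estimates the steady-state phase of the pendulum relative to the cart displacement:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Assumed parameters for an illustrative pendulum-on-a-cart model (not the team's values).
g, L, zeta, A = 9.81, 0.30, 0.05, 0.02   # gravity, length, damping ratio, cart amplitude
w0 = np.sqrt(g / L)                      # pendulum resonant frequency (rad/s)

def rhs(t, y, w):
    """Pendulum angle dynamics when the pivot (cart) moves as x_c(t) = A*cos(w*t)."""
    theta, theta_dot = y
    cart_acc = -A * w**2 * np.cos(w * t)
    theta_ddot = (-2 * zeta * w0 * theta_dot
                  - (g / L) * np.sin(theta)
                  - (cart_acc / L) * np.cos(theta))
    return [theta_dot, theta_ddot]

def phase_lag_deg(w, n_periods=200, settle=150):
    """Steady-state phase of the pendulum angle relative to the cart displacement."""
    T = 2 * np.pi / w
    t_eval = np.linspace(settle * T, n_periods * T, 4000)
    sol = solve_ivp(rhs, (0.0, n_periods * T), [0.0, 0.0], args=(w,),
                    t_eval=t_eval, rtol=1e-8, atol=1e-10)
    ref = np.exp(-1j * w * sol.t)        # project both signals onto the drive frequency
    theta_c = np.sum(sol.y[0] * ref)
    cart_c = np.sum(A * np.cos(w * sol.t) * ref)
    return np.degrees(np.angle(theta_c / cart_c))

for w in (0.3 * w0, 0.7 * w0, 1.5 * w0, 3.0 * w0):
    print(f"w/w0 = {w / w0:.1f}  phase difference ~ {phase_lag_deg(w):7.1f} deg")
# Expected: near 0 deg below resonance and near +/-180 deg above it.
```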

However, if the length of the pendulum is decreased too much, the system starts to exhibit chaotic behavior. The short pendulum begins to act in a chaotic manner, and the phase relationship between the pendulums and the cart is no longer maintained. Since the pendulum length represents the distance between a particle of coffee and the top of the cup, this implies that coffee near the top of the cup would cause the system to act chaotically. Further analysis would be needed to determine why the length affects the system in this way.
Contributors: Zindani, Abdul Rahman (Co-author) / Crane, Kari (Co-author) / Lai, Ying-Cheng (Thesis director) / Jiang, Junjie (Committee member) / Electrical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2019-12
131527-Thumbnail Image.png
Description
Object localization is used to determine the location of a device, an important aspect of applications ranging from autonomous driving to augmented reality. Commonly used localization techniques include global positioning systems (GPS), simultaneous localization and mapping (SLAM), and positional tracking, but all of these methodologies have drawbacks, especially in high-traffic indoor or urban environments. Using recent improvements in the field of machine learning, this project proposes a new method of localization that uses networks with several wireless transceivers and can be implemented without heavy computational loads or high costs. This project aims to build a proof-of-concept prototype and demonstrate that the proposed technique is feasible and accurate.

Modern communication networks depend heavily on an estimate of the communication channel, which represents the distortions a transmitted signal undergoes as it travels to a receiver. A channel can become quite complicated due to signal reflections, delays, and other undesirable effects and, as a result, varies significantly from location to location. This localization system seeks to take advantage of this distinctness by feeding channel information into a machine learning algorithm, which is trained to associate channels with their respective locations. A device in need of localization would then only need to compute a channel estimate and present it to this algorithm to obtain its location.
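
A toy version of this channel-fingerprint idea can be sketched in a few lines of Python. The channel model, anchor layout, and the nearest-neighbor regressor below are all stand-in assumptions chosen to keep the example self-contained; they are not the project's actual model or training setup:

```python
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)

# Hypothetical setup: 4 fixed transceivers ("anchors") in a 20 m x 20 m area, each
# providing 8 narrowband channel taps; magnitudes and phases are stacked into one
# fingerprint vector per user location.
anchors = np.array([[0, 0], [20, 0], [0, 20], [20, 20]], dtype=float)

def channel_fingerprint(pos, taps=8):
    """Toy distance-dependent channel (no true multipath), plus measurement noise."""
    feats = []
    for a in anchors:
        d = np.linalg.norm(pos - a) + 1e-3
        k = 2 * np.pi * np.arange(1, taps + 1) / 10.0
        h = np.exp(-1j * k * d) / d + 0.01 * (rng.normal(size=taps) + 1j * rng.normal(size=taps))
        feats.append(np.concatenate([np.abs(h), np.angle(h)]))
    return np.concatenate(feats)

positions = rng.uniform(0.0, 20.0, size=(2000, 2))
X = np.array([channel_fingerprint(p) for p in positions])
X_tr, X_te, y_tr, y_te = train_test_split(X, positions, test_size=0.2, random_state=0)

# A nearest-neighbor regressor stands in for the machine learning model: similar
# channels should come from nearby locations, so neighbors in fingerprint space
# give a position estimate.
model = KNeighborsRegressor(n_neighbors=5).fit(X_tr, y_tr)
err = np.linalg.norm(model.predict(X_te) - y_te, axis=1)
print(f"median localization error: {np.median(err):.2f} m")
```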

As an additional step, the effect of location-label noise is investigated in this report. After the localization system described above showed promising results, the team demonstrated that the system is robust to noise on its location labels. In doing so, the team showed that this system could be deployed in a continued-learning environment, in which some user agents report their estimated (noisy) locations over a wireless communication network, so that the model can operate in an environment without extensive data collection prior to release.
Contributors: Chang, Roger (Co-author) / Kann, Trevor (Co-author) / Alkhateeb, Ahmed (Thesis director) / Bliss, Daniel (Committee member) / Electrical Engineering Program (Contributor) / Barrett, The Honors College (Contributor)
Created: 2020-05
171482-Thumbnail Image.png
Description
The recent trends in wireless communication, fueled by the demand for lower latency and higher bandwidth, have caused the migration of users from lower frequencies to higher frequencies, i.e., from 2.5 GHz to millimeter wave. However, the migration to higher frequencies has its challenges. Sensitivity to blockages is a key challenge for millimeter-wave and terahertz networks in 5G and beyond. Since these networks rely mainly on line-of-sight (LOS) links, sudden link blockages severely threaten their reliability. Further, when the LOS link is blocked, the network typically needs to hand the user off to another LOS base station, which may incur critical latency, especially if a search over a large codebook of narrow beams is needed. A promising way to tackle the reliability and latency challenges lies in enabling proaction in wireless networks. Proaction allows the network to anticipate future blockages, especially dynamic blockages, and initiate user hand-off beforehand. This thesis presents a complete machine learning framework for enabling proaction in wireless networks using multi-modal 3D LiDAR (Light Detection and Ranging) point-cloud and position data. In particular, it proposes a sensing-aided wireless communication solution that utilizes bimodal machine learning to predict the user link status. This is achieved via a deep learning algorithm that learns from LiDAR point-cloud and position data to distinguish between LOS and NLOS (non-line-of-sight) links. The algorithm is evaluated on DeepSense 6G, a multi-modal wireless communication dataset that provides a time-synchronized collection of data from sensors including millimeter-wave power, position, camera, radar, and LiDAR. Experimental results indicate that the algorithm can predict link status with 87% accuracy. This highlights a promising direction for enabling high reliability and low latency in future wireless networks.
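
To illustrate the bimodal (LiDAR + position) link-status idea in the simplest possible terms, the Python sketch below builds a synthetic scene generator, collapses each point cloud into coarse per-sector range features, concatenates the user position, and trains a small classifier. Everything here, including the scene model, features, and network size, is a hypothetical stand-in rather than the deep learning pipeline evaluated on DeepSense 6G:

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

def lidar_features(points, n_bins=16):
    """Collapse an (N x 3) LiDAR point cloud into the nearest return per azimuth
    sector -- a crude stand-in for a learned point-cloud embedding."""
    az = np.arctan2(points[:, 1], points[:, 0])
    dist2d = np.linalg.norm(points[:, :2], axis=1)
    edges = np.linspace(-np.pi, np.pi, n_bins + 1)
    feats = np.full(n_bins, 50.0)                    # 50 m = "no return" placeholder
    for b in range(n_bins):
        mask = (az >= edges[b]) & (az < edges[b + 1])
        if mask.any():
            feats[b] = dist2d[mask].min()
    return feats

def synthetic_sample():
    """Hypothetical scene: random clutter, plus a blocker on the user's bearing for NLOS."""
    user_pos = rng.uniform(-20.0, 20.0, size=2)
    points = rng.uniform(-30.0, 30.0, size=(200, 3))
    is_nlos = rng.random() < 0.5
    if is_nlos:
        bearing = user_pos / np.linalg.norm(user_pos)
        blocker_xy = np.outer(rng.uniform(2.0, 5.0, size=30), bearing)
        points = np.vstack([points, np.column_stack([blocker_xy, np.zeros(30)])])
    features = np.concatenate([lidar_features(points), user_pos / 20.0])  # bimodal fusion
    return features, int(is_nlos)

samples = [synthetic_sample() for _ in range(3000)]
X = np.array([s[0] for s in samples])
y = np.array([s[1] for s in samples])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

clf = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=500).fit(X_tr, y_tr)
print(f"LOS/NLOS classification accuracy: {clf.score(X_te, y_te):.1%}")
```
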
Contributors: Srinivas, Tirumalai Vinjamoor Nikhil (Author) / Alkhateeb, Ahmed (Thesis advisor) / Trichopoulos, Georgios (Committee member) / Myhajlenko, Stefan (Committee member) / Arizona State University (Publisher)
Created: 2022
190918-Thumbnail Image.png
Description
Reconfigurable metasurfaces (RMSs) are promising solutions for beamforming and sensing applications, including 5G and beyond wireless communications, satellite and radar systems, and biomarker sensing. In this work, three distinct RMS architectures, reconfigurable intelligent surfaces (RISs), meta-transmission lines (meta-TLs), and substrate integrated waveguide leaky-wave antennas (SIW-LWAs), are developed and characterized. The ever-increasing demand for higher data rates and lower latencies has propelled the telecommunications industry to adopt higher frequencies for 5G and beyond wireless communications. However, this transition to higher frequencies introduces challenges in terms of signal coverage and path loss. Many base stations would be necessary to ensure signal fidelity in such a setting, making bulky phased-array-based solutions impractical. Consequently, to meet the unique needs of 5G and beyond wireless communication networks, this work proposes the use of RISs, characterized by low profile, low RF losses, low power consumption, and high gain, making them excellent candidates for future wireless communication applications. Specifically, RISs at sub-6 GHz, mmWave, and sub-THz frequencies are analyzed to demonstrate their ability to improve signal strength and coverage. Further, a linear meta-TL wave space is designed to miniaturize true-time-delay beamforming structures such as Rotman lenses, which are traditionally bulky. To address this challenge, a modified lumped-element TL model is proposed; a meta-TL is created by including mutual-coupling effects and can be used to slow down the electromagnetic signal and realize miniaturized lenses. A proof-of-concept 1D meta-TL is developed that demonstrates about 90% size reduction and 40% bandwidth improvement. Furthermore, a conformable antenna design for radio-frequency-based tracking of hand gestures is also detailed. An SIW-LWA is employed as the radiating element to couple RF signals into the human hand. The antenna is envisaged to be integrated into a wristband topology and to capture the changes in the electric field caused by various movements of the hand. The scattering parameters are used to track the changes in the wrist anatomy. Sensor characterization showed significant sensitivity suppression due to the lossy, multi-dielectric nature of the tissues in the wrist. However, the sensor demonstrates good coupling of electromagnetic energy, making it suitable for on-body wireless communications and magnetic resonance imaging applications.
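
As a concrete illustration of how an RIS improves coverage, the Python sketch below computes a per-element reflection phase profile that co-phases the transmitter-to-element-to-receiver paths, quantizes it to 2-bit phase shifters, and compares the resulting coherent gain at the intended receiver with a nearby off-target point. The carrier frequency, geometry, and unit-gain free-space model are assumptions made for the example, not measurements from this work:

```python
import numpy as np

# Illustrative 2-bit RIS phase design: choose per-element phases so that the
# Tx -> element -> Rx path contributions add coherently at the intended receiver.
c, f = 3e8, 28e9                          # assumed mmWave carrier (28 GHz)
lam = c / f
k = 2 * np.pi / lam

n_elem = 64
elem_x = (np.arange(n_elem) - n_elem / 2) * lam / 2      # half-wavelength spaced linear RIS
elem_pos = np.column_stack([elem_x, np.zeros(n_elem)])   # surface lies along the x-axis

tx = np.array([-5.0, 10.0])               # transmitter position (m), assumed
rx = np.array([8.0, 6.0])                 # intended receiver position (m), assumed

def path_len(p):
    """Distance from every RIS element to point p."""
    return np.linalg.norm(elem_pos - p, axis=1)

ideal_phase = k * (path_len(tx) + path_len(rx))          # continuous co-phasing profile
step = 2 * np.pi / 4                                     # 2-bit (4-level) phase shifters
quantized_phase = np.round(ideal_phase / step) * step

def received_level_db(probe, phases):
    """Coherent sum of unit-gain, free-space element contributions at a probe point."""
    field = np.sum(np.exp(1j * (phases - k * (path_len(tx) + path_len(probe))))
                   / (path_len(tx) * path_len(probe)))
    return 20 * np.log10(np.abs(field))

print("level at intended receiver :", round(received_level_db(rx, quantized_phase), 1), "dB")
print("level 2 m off the receiver :", round(received_level_db(rx + [2.0, 0.0], quantized_phase), 1), "dB")
```
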
Contributors: Kashyap, Bharath Gundappa (Author) / Trichopoulos, Georgios C (Thesis advisor) / Balanis, Constantine A (Committee member) / Aberle, James T (Committee member) / Alkhateeb, Ahmed (Committee member) / Imani, Seyedmohammedreza F (Committee member) / Arizona State University (Publisher)
Created: 2023