Matching Items (4)

Description
A firewall is a necessary component of network security, and like any other equipment it requires maintenance. To keep up with changing cyber security trends and threats, firewall rules are modified frequently. Over time such modifications increase the complexity, size and verbosity of the rule set. As the rule set grows, adding and modifying rules becomes tedious, which discourages network administrators from reviewing the work done by previous administrators before and after applying any changes. As a result, the quality and efficiency of the firewall decline.

Modifying and adding rules without knowledge of the existing rules creates anomalies such as shadowing and rule redundancy. Anomalous rule sets not only limit the efficiency of the firewall but in some cases open a hole in the perimeter security. Anomaly detection has been studied for a long time, and several well-established procedures have been implemented and tested, but they share a common problem: visualizing the results. Firewall rule anomalies do not fit well into traditional matrix, tree or sunburst representations.
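
As a rough sketch of the kind of check such detection procedures perform (not the specific method studied in this work), the snippet below flags a later rule as shadowed when an earlier, higher-priority rule matches a superset of its traffic with a different action, and as redundant when the actions agree; the simplified Rule structure and its field names are hypothetical.

```python
from dataclasses import dataclass
from ipaddress import ip_network

@dataclass
class Rule:
    # Hypothetical, simplified firewall rule: one source prefix, one
    # destination prefix, a destination port range, and an action.
    src: str          # e.g. "10.0.0.0/8"
    dst: str          # e.g. "0.0.0.0/0"
    ports: range      # e.g. range(80, 81)
    action: str       # "accept" or "deny"

def covers(general: Rule, specific: Rule) -> bool:
    """True if `general` matches every packet that `specific` matches."""
    return (ip_network(specific.src).subnet_of(ip_network(general.src))
            and ip_network(specific.dst).subnet_of(ip_network(general.dst))
            and specific.ports.start >= general.ports.start
            and specific.ports.stop <= general.ports.stop)

def classify_anomalies(rules: list[Rule]) -> list[tuple[int, int, str]]:
    """Compare each rule against every earlier (higher-priority) rule."""
    findings = []
    for i, earlier in enumerate(rules):
        for j in range(i + 1, len(rules)):
            if covers(earlier, rules[j]):
                kind = "redundancy" if earlier.action == rules[j].action else "shadowing"
                findings.append((i, j, kind))
    return findings

rules = [
    Rule("10.0.0.0/8", "0.0.0.0/0", range(0, 65536), "deny"),
    Rule("10.1.0.0/16", "0.0.0.0/0", range(80, 81), "accept"),  # never matched
]
print(classify_anomalies(rules))   # [(0, 1, 'shadowing')]
```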

This research targets the anomaly detection and visualization problem. It analyzes and represents firewall rule anomalies in innovative ways, such as hive plots and dynamic slices. These graphical representations are useful for understanding the state of a firewall and help network administrators find and fix the anomalous rules.
Contributors: Khatkar, Pankaj Kumar (Author) / Huang, Dijiang (Thesis advisor) / Ahn, Gail-Joon (Committee member) / Syrotiuk, Violet R. (Committee member) / Arizona State University (Publisher)
Created: 2014
Description
Concern regarding the quality of traffic data exists among engineers and planners tasked with obtaining and using the data for various transportation applications. While data quality issues are often understood by the analysts doing the hands-on work, the quality characteristics of the data are rarely communicated effectively beyond the analyst. This research is an exercise in measuring and reporting data quality. The assessment was conducted to support the performance measurement program at the Maricopa Association of Governments in Phoenix, Arizona, and investigates traffic data from 228 continuous monitoring freeway sensors in the metropolitan region. The results illustrate how the quality of traffic data can be described with each of six data quality measures suggested in the literature: accuracy, completeness, validity, timeliness, coverage and accessibility. An important contribution is the use of data quality visualization tools, which evaluate the validity of the traffic data beyond the pass/fail criteria commonly used and, more significantly, build an intuitive understanding of the underlying characteristics of the data considered valid. Based on the experience gained in this assessment, it is recommended that data quality visualization tools be developed and used in the processing and quality control of traffic data, and that these visualizations, along with other information on the quality control effort, be stored as metadata with the processed data.
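
As a minimal, hypothetical sketch of how two of these measures might be computed for continuous sensor data (not the actual MAG processing), the snippet below scores per-sensor completeness as the share of expected records received and validity as the share of records passing simple range checks; the column names, thresholds, and expected record count are illustrative assumptions.

```python
import pandas as pd

# Hypothetical interval records: one row per sensor per reporting
# interval, with a vehicle count and a measured speed.
records = pd.DataFrame({
    "sensor_id": [1, 1, 1, 2, 2],
    "volume":    [12, 240, 15, 9, 11],        # vehicles per interval
    "speed_mph": [63.0, 61.0, 250.0, 58.0, 60.0],
})

EXPECTED_INTERVALS = 3            # records each sensor should report in the period
MAX_VOLUME, MAX_SPEED = 200, 100  # illustrative pass/fail validity thresholds

# Completeness: share of expected records actually received per sensor.
completeness = records.groupby("sensor_id").size() / EXPECTED_INTERVALS

# Validity: share of received records passing the range checks.
valid = (records["volume"].between(0, MAX_VOLUME)
         & records["speed_mph"].between(0, MAX_SPEED))
validity = valid.groupby(records["sensor_id"]).mean()

print(pd.DataFrame({"completeness": completeness, "validity": validity}))
```

A plot of these per-sensor scores over time is the kind of visualization the assessment argues should accompany the processed data as metadata.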
Contributors: Samuelson, Jothan P (Author) / Pendyala, Ram M. (Thesis advisor) / Ahn, Soyoung (Committee member) / Arizona State University (Publisher)
Created: 2011
Description
In traditional networks the control and data planes are tightly coupled, hindering development. With Software Defined Networking (SDN), the two planes are separated, allowing innovation in either one independently of the other. Here, the control plane is formed by the applications that specify an organization's policy, and the data plane contains the forwarding logic. An application sends its commands to an SDN controller, which then performs the requested action on the application's behalf. Typically, the requested action is a modification to the flow tables in the switches to reflect a change in the organization's policy. There are a number of ways to control a network using SDN principles, but the most widely used approach is OpenFlow.

With applications now having direct access to the flow table entries, inconsistencies can easily arise among the flow table rules. Since flow rules are structured similarly to firewall rules, the research on analyzing and identifying firewall rule conflicts can be adapted to work with OpenFlow rules.
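
For illustration only (this is not the OpenDaylight implementation developed in the thesis), the sketch below shows one way such a subsumption check can be phrased for simplified OpenFlow-style matches: a wildcarded field matches anything, and a higher-priority flow that covers a lower-priority flow either shadows it or makes it redundant. The match fields and tuple layout are hypothetical.

```python
def field_subsumes(general, specific):
    """A wildcard (None) matches anything; otherwise values must be equal."""
    return general is None or general == specific

def rule_subsumes(general: dict, specific: dict) -> bool:
    """True if `general` matches every packet that `specific` matches."""
    fields = set(general) | set(specific)
    return all(field_subsumes(general.get(f), specific.get(f)) for f in fields)

def find_conflicts(flows):
    """flows: list of (priority, match, action) tuples. A lower-priority
    flow fully covered by a higher-priority flow is shadowed when the
    actions differ and redundant when they agree."""
    conflicts = []
    for i, (pri_hi, m_hi, act_hi) in enumerate(flows):
        for j, (pri_lo, m_lo, act_lo) in enumerate(flows):
            if i != j and pri_hi > pri_lo and rule_subsumes(m_hi, m_lo):
                kind = "redundancy" if act_hi == act_lo else "shadowing"
                conflicts.append((i, j, kind))
    return conflicts

flows = [
    (200, {"nw_src": None, "tp_dst": 80}, "drop"),
    (100, {"nw_src": "10.0.0.1", "tp_dst": 80}, "output:2"),  # never matched
]
print(find_conflicts(flows))   # [(0, 1, 'shadowing')]
```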

The main work of this thesis is to implement flow conflict detection logic in OpenDaylight and to examine techniques for visualizing the conflicts. A hierarchical edge-bundling technique coupled with a Reingold-Tilford tree is employed to present the relationships between conflicting rules, and a table-driven approach displays the details of each flow.

Both visualizations are then tested for correctness by providing them with flows known to contain conflicts; the conflicts are identified correctly and displayed in the views.
Contributors: Natarajan, Janakarajan (Author) / Huang, Dijiang (Thesis advisor) / Syrotiuk, Violet R. (Thesis advisor) / Ahn, Gail-Joon (Committee member) / Arizona State University (Publisher)
Created: 2016
Description
Elizabeth Grumbach, the project manager of the Institute for Humanities Research's Digital Humanities Initiative, shares methodologies and best practices for designing a digital humanities project. The workshop offers participants an introduction to digital humanities fundamentals, specifically tools and methodologies, and explores technologies and platforms that allow scholars of all skill levels to engage with digital humanities methods. Participants are introduced to a variety of tools (including mapping, visualization, data analytics, and multimedia digital publication platforms) and learn how and why to choose specific applications, platforms, and tools based on project needs.
Contributors: Grumbach, Elizabeth (Author)
Created: 2018-09-26