Central pit craters on Mars are another class of intriguing structures that probe subsurface physical properties. These kilometer-scale pits are nested in the centers of many impact craters on Mars as well as on icy satellites. They are inferred to form in the presence of a water-ice-rich substrate; however, the process(es) responsible for their formation remain debated. Previous models invoke origins either by explosive excavation of potentially water-bearing crustal material or by subsurface drainage of meltwater and/or collapse. I assessed radial trends in grain size around central pits using thermal inertias calculated from Thermal Emission Imaging System (THEMIS) thermal infrared images. Average grain size decreases with radial distance from pit rims, which is consistent with pit-derived ejecta but not expected for collapse models. I present a melt-contact model that could enable a delayed explosion: a central uplift brings ice-bearing substrate into contact with impact melt, generating steam explosions that excavate central pits during the impact modification stage.
On the Moon, I investigated ballistically emplaced explosive volcanic deposits within Oppenheimer crater. Lunar Reconnaissance Orbiter (LRO) Diviner Radiometer mid-infrared data, LRO Camera images, and Chandrayaan-1 Moon Mineralogy Mapper near-infrared spectra were used to test the hypothesis that the pyroclastic deposits in Oppenheimer crater were emplaced via Vulcanian activity, by constraining their composition and mineralogy. The mineralogy and iron content of the pyroclastic deposits vary significantly, including examples of potentially very high-iron compositions, which indicates variability in eruption style. These results suggest that localized lunar pyroclastic deposits may have a more complex origin and mode of emplacement than previously thought.
A number of emerging dynamic traffic analysis applications, such as regional or statewide traffic assignment, require a theoretically rigorous and computationally efficient model to describe the propagation and dissipation of system congestion under bottleneck capacity constraints. An open-source, lightweight dynamic traffic assignment (DTA) package, DTALite, has been developed to enable rapid adoption of advanced dynamic traffic analysis capabilities. This paper describes its three major modeling components: (1) a lightweight dynamic network loading simulator that embeds Newell’s simplified kinematic wave model; (2) a mesoscopic agent-based DTA procedure that incorporates driver heterogeneity; and (3) an integrated traffic assignment and origin–destination demand calibration system that iteratively adjusts path flow volume and distribution to match observed traffic counts. A number of real-world test cases demonstrate the effectiveness and performance of the proposed models under different network and data availability conditions.
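Newell’s simplified kinematic wave model, embedded in component (1), can be expressed with cumulative vehicle counts and a triangular fundamental diagram: departures from a link are bounded by the forward wave (demand), the backward wave plus jam storage (supply), and bottleneck capacity. The function below is a minimal single-link sketch of that idea, not DTALite’s actual implementation; all parameter names and values (`vf`, `w`, `kj`, `qmax`) are illustrative assumptions.

```python
def newell_link(arrivals, L=1.0, vf=60.0, w=20.0, kj=200.0,
                qmax=1800.0, dt=1/3600):
    """Single-link sketch of Newell's simplified kinematic wave model.

    arrivals: inflow rate (veh/h) for each time step of length dt (h).
    L: link length (km); vf: free-flow speed (km/h); w: backward wave
    speed (km/h); kj: jam density (veh/km); qmax: capacity (veh/h).
    Returns cumulative arrival and departure counts (A, D).
    """
    n = len(arrivals)
    A = [0.0]                      # cumulative arrivals at link entrance
    for q in arrivals:
        A.append(A[-1] + q * dt)
    D = [0.0] * (n + 1)            # cumulative departures at link exit
    ff = int(round(L / vf / dt))   # free-flow travel time, in steps
    bw = int(round(L / w / dt))    # backward-wave travel time, in steps
    for t in range(1, n + 1):
        demand = A[t - ff] if t >= ff else 0.0                 # forward wave
        supply = (D[t - bw] + kj * L) if t >= bw else float("inf")  # backward wave + jam storage
        cap = D[t - 1] + qmax * dt                             # capacity constraint
        D[t] = min(demand, supply, cap)
    return A, D
```

With a 10-minute inflow of 3600 veh/h against an 1800 veh/h capacity, the departure curve falls behind the arrival curve and a queue accumulates, which is the congestion propagation behavior the network loading simulator relies on.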
Trip travel time reliability is an important measure of transportation system performance and a key factor affecting travelers’ choices. This paper explores a method for estimating travel time distributions along corridors that contain multiple bottlenecks. A set of analytical equations is used to calculate the number of queued vehicles ahead of a probe vehicle and to capture several important factors affecting travel times: the prevailing congestion level, queue discharge rates at the bottlenecks, and flow rates associated with merges and diverges. Based on multiple random scenarios and a vector of arrival times, the lane-by-lane delay at each bottleneck along the corridor is recursively estimated to produce a route-level travel time distribution. The model incorporates stochastic variations of bottleneck capacity and demand and captures the travel time correlations between sequential links. Its data needs are the entering and exiting flow rates and an approximate lane-by-lane distribution of traffic at each bottleneck. A detailed vehicle trajectory dataset from the Next Generation SIMulation (NGSIM) project is used to verify that the estimated distributions are valid, and the sources of estimation error are examined.
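The scenario-based idea can be illustrated with a toy point-queue model: sample bottleneck capacity and demand, compute the queueing delay a probe vehicle experiences behind the vehicles queued ahead of it, and collect the results into a travel time distribution. This is a schematic sketch under simplifying assumptions (a single bottleneck, a point queue), not the paper’s analytical equations; the function names and all parameter values are illustrative.

```python
import random

def probe_travel_time(arrival_time, free_flow_time, demand, capacity, onset=0.0):
    """Point-queue delay for a probe vehicle (times in hours, rates in veh/h):
    vehicles queued ahead of the probe, divided by the discharge rate,
    plus the free-flow travel time."""
    if demand <= capacity or arrival_time <= onset:
        return free_flow_time          # no queue: free-flow conditions
    queued = (demand - capacity) * (arrival_time - onset)  # vehicles ahead
    return free_flow_time + queued / capacity

def travel_time_distribution(n_scenarios=1000, seed=1):
    """Monte Carlo over random capacity/demand scenarios and probe
    arrival times; returns the sorted travel times (an empirical CDF)."""
    rng = random.Random(seed)
    times = []
    for _ in range(n_scenarios):
        capacity = rng.gauss(1800.0, 100.0)   # stochastic discharge rate
        demand = rng.gauss(2000.0, 150.0)     # stochastic arrival rate
        t_arr = rng.uniform(0.0, 0.5)         # probe arrival within the peak
        times.append(probe_travel_time(t_arr, 0.1, demand, capacity))
    return sorted(times)
```

Because demand occasionally falls below capacity, some scenarios return the free-flow time while others accrue queueing delay, so the resulting distribution mixes uncongested and congested outcomes, the basic shape a reliability measure has to capture.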