This significant change in trigger rate raises the question of whether the current hardware can trigger effectively at 100 Hz. This section explores the computing performance of the ARIANNA pilot station hardware, which includes the waveform digitizers, the readout into the FPGA, the transmission from the FPGA to the MBED for further processing, and the application of the deep learning filter. The deadtime is the total processing time required to transfer and package the data and to run the on-board analysis programs. Deadtime is measured from the time delay between reset pulses, each sent by the MBED when it is ready to receive a new event. To confirm the simulation results for the fractional livetime as a function of noise trigger rate, the ARIANNA DAQ board is injected with signal-like voltage pulses periodically at a rate of 100 mHz, with an amplitude high enough to always fulfill the trigger condition. At the same time, the trigger threshold is lowered after every trial to increase the noise trigger rate. For example, at a noise trigger rate of 0 Hz, 100% fractional livetime is expected, since the period between injected pulses is longer than the deadtime, allowing all injected pulses to be processed and saved to the SD card. In figure 11, the results of the experiment can be seen as the threshold is lowered, which in turn increases the noise rate and decreases the livetime.
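The expected shape of this livetime curve can be sketched with a simple non-paralyzable dead-time model (an assumption for illustration; the per-event deadtime value below is a hypothetical placeholder, not the measured ARIANNA number):

```python
def fractional_livetime(noise_rate_hz: float, deadtime_s: float) -> float:
    """Non-paralyzable dead-time model: fraction of wall-clock time
    during which the DAQ can accept a new trigger."""
    return 1.0 / (1.0 + noise_rate_hz * deadtime_s)

# Illustrative per-event deadtime (assumed value for the sketch).
tau = 0.01  # 10 ms to transfer, package, and filter one event

for rate in (0.0, 1.0, 10.0, 100.0):
    print(f"noise rate {rate:6.1f} Hz -> "
          f"livetime {fractional_livetime(rate, tau):.1%}")
```

At 0 Hz the model gives 100% livetime, matching the expectation for the periodic injected pulses; as the threshold is lowered and the noise rate grows, the livetime falls toward zero.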
The CNN reaches a rejection factor of 10^5, but trigger rates are limited to 10 Hz or lower to keep instrumental deadtime under 10%. This result motivates improvements in hardware, such as replacing the MBED with a faster processing device.

The ARIANNA detector stations are simultaneously sensitive to neutrinos and to cosmic rays that interact in the atmosphere; in addition to the downward-facing antennas designed to detect neutrinos, the ARIANNA detector contains upward-facing antennas to detect cosmic rays. Cosmic rays provide a calibration beam with radio-frequency content and time variation similar to those expected from neutrinos. Moreover, both cosmic rays and neutrinos produce bipolar signals that are short compared to the response time of the antennas and significantly distinct from thermal fluctuations. They therefore provide another opportunity to verify the behavior of the CNN on signals that should not be rejected as noise. Cosmic rays trigger a given station at a rate of only a few events per day, so it is important to keep as many cosmic-ray events as possible.

To test the network, a set of cosmic-ray events from ARIANNA station 52, collected between November 2018 and March 2019, is used. For these data, the two-of-four coincidence logic trigger is set to a threshold of 4.4 SNR, and the event set has an expected purity of > 95%. Then the 100-input-sample CNN trained on neutrino data and described in section 3 is applied to these cosmic-ray data. This network classifies 102 out of 104 cosmic-ray events as signal, a higher efficiency than obtained for neutrinos in this study because the thresholds are slightly larger. A network trained only with neutrinos thus still identifies cosmic rays adequately. An even better cosmic-ray efficiency can likely be obtained by including cosmic rays in the training data for a more robust CNN filter.
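As a rough check on the statistical weight of the 102-out-of-104 result, a binomial confidence interval can be attached to the cosmic-ray efficiency. The Wilson score interval below is our own illustration, not an uncertainty quoted in the study:

```python
import math

def wilson_interval(k: int, n: int, z: float = 1.96):
    """Wilson score interval for a binomial proportion k/n
    at the confidence level implied by z (1.96 -> ~95%)."""
    p = k / n
    denom = 1.0 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

eff = 102 / 104
lo, hi = wilson_interval(102, 104)
print(f"cosmic-ray efficiency {eff:.1%}, 95% CI [{lo:.1%}, {hi:.1%}]")
```

With only 104 events, the interval is a few percent wide, which is consistent with the efficiency being comparable to the 95% neutrino signal efficiency discussed elsewhere in the paper.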
Due to the low neutrino flux at ultra-high energies, the sensitivity of the detector to a flux of high-energy neutrinos is limited by statistics. Implementing deep learning techniques to increase the sensitivity of the ARIANNA detector makes it possible to probe new parameter space. A small convolutional neural network was implemented on the ARIANNA MBED microcontroller to discriminate between thermal noise fluctuations and neutrino signals. CNN filters were shown to be both more accurate and computationally faster than simple cross-correlation methods in distinguishing thermal events from neutrino signals. Only one thermal event in every 10^5 thermal triggers was misidentified by the CNN, while 95% of the neutrino signal was correctly identified. Consequently, the trigger rate can be increased by five orders of magnitude while transmitting at an event rate of 0.3 mHz over the Iridium communication network. This results in an increase in neutrino sensitivity of 40% at 10^18 eV and up to a factor of two at lower energies. The simulation study was verified by lab measurements that found excellent agreement between the measured and simulated distributions of both neutrino and thermal noise events. A set of measured cosmic rays was then run through the 100-input-sample CNN, and 102 out of 104 events were classified as signal. At present, the processing speed of the ARIANNA hardware is the limiting factor, restricting the low-level trigger rate to below 10 Hz to avoid deadtime. In the future, several improvements to the ARIANNA hardware will be considered. First, the ARIANNA hardware that sends the digitized data from the FPGA to the microprocessor can be parallelized to decrease the event readout time, which would allow the detector to trigger at even higher rates. Second, more capable computing on the ARIANNA station through improved electronics will be studied.
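The orders-of-magnitude argument above can be made concrete with a short rate-budget sketch. The 10 Hz trigger rate, 10^5 rejection factor, and 95% signal efficiency are taken from the text; the arithmetic itself is our illustration, not a calculation from the paper:

```python
# Back-of-the-envelope rate budget for the on-board CNN filter.
trigger_rate_hz = 10.0      # hardware-limited low-level trigger rate
rejection_factor = 1e5      # one thermal trigger kept per 10^5
signal_efficiency = 0.95    # fraction of triggered neutrino signal kept

# Thermal events that survive the filter and must be transmitted.
transmitted_noise_hz = trigger_rate_hz / rejection_factor
print(f"surviving thermal rate: {transmitted_noise_hz * 1e3:.2f} mHz")
print(f"neutrino signal retained: {signal_efficiency:.0%}")
```

With these inputs the surviving thermal rate is 0.1 mHz, the same order of magnitude as the 0.3 mHz Iridium event rate quoted above, which is why the five-orders-of-magnitude trigger-rate increase fits within the satellite bandwidth.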
The current generation of ARIANNA hardware is now more than 10 years old, and many recent microcomputer systems offer more performance at comparable power consumption. An increase in computing speed would allow more complex CNN architectures, with a corresponding improvement in the trade-off between neutrino signal efficiency and background rejection. The combination of these two changes would increase the sensitivity of the ARIANNA detector even if the communication transfer rates remained the same. However, the next generation of Iridium satellites, Iridium NEXT, has recently been deployed. This system has the potential to increase the transfer rates by many orders of magnitude relative to the SBD message transfer system currently used by ARIANNA. Reliance on deep learning filters may lead to unwanted results when incoming data deviate from the training data, so they must be carefully evaluated by laboratory and field studies. The lab measurements described in this paper are encouraging and suggest that the simulations describe reality. The next stage of confirmation studies follows a plan similar to the one used to validate the simulation studies of the sensitivity of the ARIANNA detector. After modifying the software in the data acquisition system to include the CNN filter, we plan to use a variety of radio transmitters within a preexisting borehole drilled to a depth of 1700 m at the South Pole to confirm signal efficiency at various thermal noise trigger rates, and to compare the rate and physical properties of cosmic-ray events to data samples collected without the filter. In addition to the current hardware constraints, there are limitations on the simulation tools as well. This analysis achieved five orders of magnitude of thermal noise rejection with simulated data, but the simulations do not yet include a number of real-world effects that can create small corrections to the data.
One example is the ice model, which does not include the layered density structure of the ice through which signals propagate. The incompleteness of the simulations can affect the in-field performance. However, these are known limitations that will be explored in the next round of simulations.

Modern agriculture faces environmental concerns about the use of pesticides. Organic agriculture is an alternative production method that limits the use of synthetic pesticides and fertilizers. The literature has documented that organic crop production has a lower environmental impact per unit of land than conventional agriculture. However, previous studies often concentrate on a small geographic or crop-variety scope. In essay 1, I use the California Pesticide Use Report database, which includes all pesticide use in commercial production, to examine the environmental impacts of conventional and organic crop production at full scale. I examine the period 1995 to 2015 and find that pesticides used in organic production had smaller negative environmental impacts on surface water, groundwater, soil, air, and pollinators than pesticides used in conventional production. Over time, this difference has declined. I also investigate how farm size and farming experience are correlated with pesticide use. I find that farmers with more acreage use pesticides that have larger environmental impacts. Specifically, more experienced farmers use pesticides that have greater impact on surface water and groundwater, and less impact on soil, air, and pollinators. The environmental impacts of pesticide use in organic agriculture increased over my study period, an interesting observation that requires further investigation. In essay 2, I focus on organic crop production and quantify the change in pesticide use.
I find that the pesticide portfolio has changed dramatically for organic crop growers, as illustrated by the decline in sulfur use and the increase in spinosad use. Pesticide use is correlated with farm size. The consolidation of organic cropland is another trend documented in essay 2. Historically, consolidation in agriculture as a whole has manifested as a decrease in the number of farms while total cropland remains stable. In the organic sector, in contrast, both the number of farms and acreage have grown significantly over the last two decades.
Nonetheless, consolidation has occurred because the share of large farms in total acreage has increased. In 2015, 56% of organic cropland was operated by growers with at least 500 acres of organic cropland, up from 15% in 1995. At the other end of the spectrum, growers with 10-50 acres accounted for 18% of organic cropland in 1995, which dropped to 8% in 2015. The average organic farm size increased from 46 acres in 1995 to 103 acres in 2015, while the median increased from 15 to 17 acres over the same period. Farms with larger organic acreage, holding other variables constant, applied sulfur and fixed copper pesticides more frequently than those with smaller acreage. As a result, they had greater impacts on surface water and smaller impacts on soil and air, because those ingredients are more toxic to fish and algae, less toxic to earthworms, and have lower volatile organic compound emissions than other ingredients used in organic fields. The composition of organic crops has changed in California, with the acreage share of vegetables increasing from 30% in 1995 to 50% in 2015. However, pesticide use patterns and their correlation with farm size do not differ between vegetables and other crops. The consolidation of cropland has not been limited to the organic sector. MacDonald et al. documented that the consolidation of acreage and value of production into a smaller number of larger operations has characterized U.S. agriculture for decades. In essay 3, I adapt and extend the endogenous growth model introduced in Lucas to explain changes in the size distribution of farms and in specialization over time. In the theoretical model, farmers have knowledge regarding the production of each crop, and this knowledge grows only through learning from other farmers. Increased knowledge increases profitability, and knowledge can be applied across crops to various degrees.
In my modeling framework, the opportunity cost of producing crops that farmers know less about increases as specialized knowledge accumulates, which reduces the number of crops produced by each farmer. The evolution of the farm size distribution in equilibrium and simulation results are presented to demonstrate how model parameters, including the learning rate, budget share, and elasticity of substitution, alter the distribution of farm size and specialization.

The food system has faced concerns about its use of pesticides since even before Rachel Carson published Silent Spring. Today, concerns about environmental impacts from pesticide applications continue to grow. In this context, organic agriculture is proposed as an alternative farming system, as it prohibits the use of most synthetic substances. With strict modeling assumptions, Muller et al. present simulation results that support organic agriculture as an alternative production system capable of providing food for the world population by 2050. Consumers' perception that organic agriculture is more environmentally friendly has facilitated its growth. According to the Organic Trade Association, U.S. organic food commodity sales reached $39 billion in 2015 in real terms, up from $4 billion in 1997, the base year. The share of organic food sales in total food commodity sales increased from less than 1% to 5% during the same period. In 2002, the National Organic Program was launched.