Abstract

With the continuous growth of populations and economic expansion in developing countries, the availability of sufficient water resources is approaching a critical state, especially in arid and semi-arid lands. In Egypt, although the Nile has been sufficient for many centuries, its dependability for all of life's applications in the coming decades is in question. By far, agricultural practices consume the greatest portion of fresh water from the Nile. As a result, a growing effort is dedicated to investigating the use of treated wastewater for irrigation, instead of virgin fresh water, as a best sustainable practice. When it comes to the use of treated wastewater in agriculture, the contamination of highest concern is microbiological (bacteria such as E. coli, as well as viruses, protozoa, and fungi). The direct application of untreated wastewater not only poses great risks to the health of the workers and the local community involved, but also a high risk of contaminating the groundwater and the harvested crops. However, the extent to which the wastewater should be treated before irrigation is a question that must be answered for the relevant site-specific conditions: while under-treatment renders the water unsafe, over-treatment can be costly and economically impractical. This study is a small part of a larger investigation that seeks to inform the development of guidelines for the sustainable use of treated wastewater in agriculture based on microbial contamination (using E. coli as an indicator) in a host environment representative of arid and semi-arid regions (sandy desert soil and outdoor desert conditions). The extent and rate of microbial growth, as well as the decay rate, are greatly affected by the host environment, which in this case comprises the soil media properties (such as the soil's organic content), the temperature, and the exposure to sunlight.
To accomplish this, bacterial survival experiments were conducted in static soil column tests set up in the laboratory before exposure to outdoor conditions. Bacterial growth was studied for three different initial buffer concentrations, with each condition repeated in the summer and the winter for soils with three different organic fractions (0.035%, 0.3%, and 0.5%). Samples were taken at different times throughout each experiment, which in most cases lasted one week. The study showed that in most cases the total bacterial cells reached their peak value within one day (24 hours). The extent and rate of growth and decay depended considerably on the soil organic fraction and the temperature. At lower temperatures, bacterial cell counts were observed to increase by up to three orders of magnitude over their initial value, and the bacteria survived longer with slower inactivation rates. During the summer, on the other hand, die-off was often more rapid owing to higher temperatures, more intense solar radiation, decreased moisture, and faster decomposition of soil nutrients. The concentration profile within a column often varied more during summer experiments than winter ones. A strong correlation was observed between bacterial growth and survival and the organic fraction of the soil: the highest peaks in the relative total cell counts in the soil column occurred at the higher organic fractions. Increasing the organic content of the soil also tended to prolong bacterial survival, even at high temperatures. As anticipated, the extent of E. coli growth in the test soil was directly proportional to the concentration of cells in the solution added to the soil columns.
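Inactivation rates like those compared here are commonly summarized with a first-order (Chick's law) die-off model, C(t) = C0·exp(−kt). The sketch below is purely illustrative: the concentrations and time span are hypothetical values, not data from this study.

```python
import math

def first_order_k(c0, ct, t_days):
    """Estimate the first-order inactivation rate constant k (1/day)
    from initial and final concentrations, assuming C(t) = C0 * exp(-k*t)."""
    return math.log(c0 / ct) / t_days

def t90(k):
    """Time (days) for a one-log10 (90%) reduction at rate constant k."""
    return math.log(10) / k

# Hypothetical example: counts fall from 1e6 to 1e3 CFU/g over 7 days
k = first_order_k(1e6, 1e3, 7.0)
print(f"k = {k:.3f} per day, T90 = {t90(k):.2f} days")
# → k = 0.987 per day, T90 = 2.33 days
```

Fitting such a k separately to the summer and winter data would give a compact way to compare die-off across temperatures and organic fractions.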
The results of this study should aid in the development of sustainable practices for cultivating desert lands with treated wastewater, minimizing risks to human health and the environment while providing data with which to quantify those risks. The results should also help establish more realistic guidelines for acceptable pathogen levels in treated wastewater used in desert reclamation projects in arid regions such as the Middle East, because they account for site-specific variables unique to these environments.

Department

Environmental Engineering Program

Degree Name

MS in Environmental Engineering

Graduation Date

2-1-2012

Submission Date

January 2012

First Advisor

Smith, Edward

Extent

NA

Document Type

Master's Thesis

Library of Congress Subject Heading 1

Water-supply -- Egypt -- Management.

Library of Congress Subject Heading 2

Sustainable development -- Egypt.

Rights

The author retains all rights with regard to copyright. The author certifies that written permission from the owner(s) of third-party copyrighted matter included in the thesis, dissertation, paper, or record of study has been obtained. The author further certifies that IRB approval has been obtained for this thesis, or that IRB approval is not necessary for this thesis. Insofar as this thesis, dissertation, paper, or record of study is an educational record as defined in the Family Educational Rights and Privacy Act (FERPA) (20 USC 1232g), the author has granted consent to disclosure of it to anyone who requests a copy.

Institutional Review Board (IRB) Approval

Not necessary for this item

Comments

Without the support, help, and guidance of Professor Edward Smith, my supervisor and my very patient and tolerant advisor, I would not have been able to complete or present this work. Dr. Smith, I would like to give you my eternal and unfailing gratitude for everything you have done for me and for your patience with me. I also thank Engr. Ahmed Saad for his help and support during my experimental work; without it, I would not have been able to make it this far.

Share

COinS