Search Results for: water

What is the difference between the Computing for Sustainable Water project and the Computing for Clean Water project?

The Computing for Sustainable Water project is studying how changes in human activities could help improve the quality of watersheds, critical for sustaining life and food sources. The Computing for Clean Water project is trying to develop less expensive water filters so that it would be more practical to produce clean drinking water from poor water sources.

Display similar help items

Why is clean water important?

In some parts of the world, people take clean water for granted. But in much of the developing world, water is a scarce and often unavailable commodity. In the year 2000, about 8% of the world’s population lived in countries chronically short of water. But by 2050 it is estimated that this number will rise to 45% of the world’s population – by then the equivalent of 4 billion people.

Even where water is available, if it is not clean, it can become one of the biggest factors in spreading debilitating and often fatal diseases. Millions of people, including an estimated 1.4 million children, die each year, most often from diarrhea caused by drinking unsafe water.

Display similar help items

How does ultrafiltration work?

Ultrafiltration is the process of reducing or eliminating very small particles in water by passing the water, under very high pressure, through a membrane containing very fine pores. The unwanted particles have a harder time getting through the membrane than the water molecules, so fewer of them appear on the other side. The high pressure needed for ultrafiltration requires expensive equipment and a great deal of energy.

Any way of reducing the pressure needed for ultrafiltration can make water purification a cheaper and more accessible process. This is precisely what the Computing for Clean Water project ultimately aims to achieve, by first studying in detail how water molecules flow through such filters.
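
The help text does not give a formula, but the classical (no-slip) Hagen-Poiseuille relation for laminar flow through a cylindrical pore illustrates why very fine pores demand very high pressure:

\[ Q = \frac{\pi r^{4} \, \Delta P}{8 \mu L} \]

Here Q is the volumetric flow rate, r the pore radius, \(\Delta P\) the applied pressure difference, \(\mu\) the viscosity of water, and L the pore length. Because Q scales as the fourth power of r, halving the pore radius requires roughly a sixteen-fold increase in pressure to maintain the same flow through each pore.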

Display similar help items

What is the expected practical outcome of this project?

The research is primarily driven by a desire to understand, at a fundamental level, why experimental results show that water can flow through some nanotube filters far more easily than expected according to the classical laws of hydrodynamics. By gaining a better understanding of these fundamentals, the research aims to shed light on ways in which such filters could be improved even further, leading to more affordable and more energy-efficient water filters for cleaning and desalinating water.
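
One convenient way to quantify this puzzle (an illustration, not a definition taken from the project's own text) is the flow-enhancement factor: the ratio of the measured flow to the classical no-slip Hagen-Poiseuille prediction for the same pore and pressure,

\[ \epsilon = \frac{Q_{\text{measured}}}{Q_{\text{Hagen-Poiseuille}}} \]

Experiments on carbon nanotube membranes have reported values of \(\epsilon\) well above one, sometimes by orders of magnitude, and explaining the molecular origin of that enhancement is what the simulations are designed to do.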

Display similar help items

Are any animals involved in Schistosoma transmission?

Freshwater snails are the primary carriers of Schistosoma. In addition to humans, other wild and domestic animals can be infected, in some cases acting as a reservoir for the disease.

Display similar help items

What causes a decrease in the levels of dissolved oxygen?

Nutrients and sediment flow into the Bay from the surrounding land areas. These nutrients, such as nitrogen and phosphorus, lead to the growth of algae, commonly seen as a green film on the water’s surface. As these algae die, sink to the bottom, and decompose, they consume oxygen, leaving the water with insufficient dissolved oxygen to support life.

Display similar help items

How does reverse osmosis work?

To understand reverse osmosis, first consider osmosis. Osmosis is the movement of a solvent, such as water, through a semi-permeable membrane to equalize the concentrations of a solute, such as salt, on each side of the membrane. If, for example, salt solutions of unequal concentration were placed on the two sides of a suitable membrane, water would move through the membrane from the lower-concentration side to the side with the higher concentration of salt. By applying sufficient pressure to the higher-concentration side, the process can be reversed, hence the name reverse osmosis. Reverse osmosis can therefore effectively reduce salt concentrations, and it is one of several major approaches used to remove salt from seawater.
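
As a rough illustration (not part of the original answer), the minimum pressure a reverse-osmosis system must overcome is the osmotic pressure of the salt solution, which for dilute solutions is approximated by the van 't Hoff relation:

\[ \pi = i \, c \, R \, T \]

where i is the number of ions per dissolved formula unit (about 2 for NaCl), c the molar concentration, R the gas constant and T the absolute temperature. For seawater, with roughly 0.6 mol/L of dissolved salt and T ≈ 298 K, this gives \(\pi \approx 2 \times 0.6 \times 0.083 \times 298 \approx 30\) bar, so practical seawater reverse-osmosis systems must apply pressures well above this value.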

Display similar help items

How much computing power does this project need, and why?

Based on the molecular dynamics simulations that the researchers have done up to now, using a cluster of 20 nodes (160 CPU cores) for a couple of months at a time, they estimate that extending the simulations to the water-flow velocities typical of practical nanotube filters will require another factor of 400 or more in compute time. Simulating a representative range of membrane pore sizes would require a further factor of 10, for a total on the order of 106 thousand single-core CPU-years. Add to this the wide variety of contaminants they would like to include in the simulated water, and the sky is the limit!
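
The quoted total can be checked with back-of-the-envelope arithmetic based only on the figures in the paragraph above (a rough sketch, not the researchers' own calculation):

```python
# Rough check of the CPU-time estimate quoted above, using only the figures
# given in the text: 160 cores for about 2 months per campaign, a factor of
# ~400 to reach realistic flow velocities, and a further factor of ~10 to
# cover a representative range of pore sizes.
cores = 160
months = 2
core_years_so_far = cores * months / 12        # ~27 single-core CPU-years
velocity_factor = 400
pore_size_factor = 10

total = core_years_so_far * velocity_factor * pore_size_factor
print(f"{total:,.0f} single-core CPU-years")   # about 106,700
```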

Of course, the researchers will have to go one step at a time, and a lot of the computing effort will be to verify previous results at each stage and to make sure the results are reliable.

Display similar help items

What are microorganisms?

Microorganisms are microscopically small life forms, mostly single celled, and include bacteria, archaea, protozoa, yeasts and microscopic algae. Members of these diverse groups are present in almost all environments on earth: in the air, water, earth, rocks, and even where conditions are very harsh, such as the deep ocean and polar environments. They play a crucial role in maintaining all ecological systems and interact closely with one another and with other life forms. They are present in and around other living systems, such as plants, animals and humans.

Display similar help items

Where will the results of this research be published?

The researchers expect to publish papers in a number of academic journals. Typically, Physical Review Letters and Applied Physics Letters are targeted for similar sorts of research topics. Really big breakthroughs might get into prestigious multidisciplinary journals like Science and Nature. Of course, for any publication that gets accepted, the volunteers on Computing for Clean Water will be the first to know, and will be duly acknowledged in the articles.

Display similar help items

What is a General Circulation Model (GCM)?

A GCM is a global, three-dimensional computer model of the climate system, which can be used to simulate the earth's climate. GCMs are highly complex and represent the effects of such factors as reflective and absorptive properties of atmospheric water vapor, greenhouse gas concentrations, clouds, solar heating, sea temperatures and ice boundaries. The most advanced GCMs include global representations of the atmosphere, oceans, and land surface.

Display similar help items

What are climate model parameters?

Climate model parameters are numbers that quantify certain factors in the rules of a climate model. Quantities related to land-surface types (vegetation, bare land, water) or to the amount of atmospheric convection are examples of climate model parameters. Climate model parameters also include the specification of factors that are prescribed rather than simulated, such as the amount of rain produced for a given combination of humidity, wind and temperature.
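
As a purely hypothetical illustration (the parameter names and values below are invented, not taken from any specific climate model), a small set of such parameters might be expressed like this:

```python
# Invented example of climate-model parameters expressed as a configuration.
# Names and values are illustrative only and do not come from a real model.
climate_parameters = {
    # quantities related to land-surface types
    "land_surface_fractions": {"vegetation": 0.4, "bare_land": 0.3, "water": 0.3},
    # a knob controlling the strength of atmospheric convection
    "convective_entrainment_coefficient": 3.0e-4,
    # a prescribed (not simulated) factor: rain produced at a given humidity
    "critical_relative_humidity_for_rain": 0.85,
}
```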

Display similar help items

How will the Computing for Sustainable Water project address the problem?

This project will, via a detailed simulation model of the entire Chesapeake Bay Watershed, test the impact of a large number of Best Management Practices (BMPs) over a 20-year period. The proposed BMPs will be tested individually and in combination to assess their potential to effectively reduce the flow of nutrients and sediment into the Chesapeake Bay. Each of the various BMPs is expected to reduce the overall level of nutrients flowing into the Bay to some extent. However, scientists have no way of knowing in advance exactly how effective each might be. The project will provide an answer, allowing policy-makers to choose those BMPs that will have the greatest impact.
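
The combinatorial nature of that task is one reason so much computing power is needed. As a toy illustration (the BMP list below is made up, and the project's actual scenario set is not specified here), even a handful of practices produces dozens of distinct combinations to simulate, each over a 20-year period:

```python
# Toy illustration: the number of BMP combinations grows rapidly with the
# number of practices considered. The BMP names are invented for this sketch.
from itertools import combinations

bmps = ["stream buffers", "reduced fertilizer", "winter cover crops",
        "manure management", "storm-water separation", "street cleaning"]

scenarios = [c for r in range(1, len(bmps) + 1) for c in combinations(bmps, r)]
print(len(scenarios))   # 63 non-empty combinations from just 6 practices
```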

Display similar help items

How will the results of this project help other watersheds and catchments?

The Chesapeake Bay Watershed is but one of over 400 major watershed/catchment systems globally. It is not unique in facing the challenges of population growth, increasing urbanization, and changing environmental conditions. The results reported from this project can inform policy-makers worldwide about the best practices to employ to restore and sustain the globe’s precious water resources. Perhaps more importantly, information from this simulation can help citizens make better choices and help the private sector identify opportunities for new products, services, and processes that reduce nutrient flow.

Display similar help items

What will this project do?

The project will compare about 200 million proteins encoded by the genes from a wide variety of known and unknown organisms. These genes came from organisms in samples taken from a range of environments, including water and soil, as well as on and in plants and animals. DNA from all the organisms in those samples (the metagenome) was extracted and analyzed to identify genes that encode proteins, most of which are enzymes. Uncovering Genome Mysteries will compare the proteins encoded by those genes to one another, both individually and in groups, to find genetic similarities. Such similarities can reveal the functions these organisms perform in various natural processes. Scientists can then use that knowledge to design solutions to solve important environmental, medical and industrial problems.
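
The scale of that comparison is easy to underestimate. A rough calculation (illustrative only; the project may well group sequences rather than compare every pair directly) shows why volunteer computing is needed:

```python
# All-against-all comparison of ~200 million protein sequences: the number of
# distinct pairs is n*(n-1)/2, which is astronomically large.
n = 200_000_000
pairs = n * (n - 1) // 2
print(f"{pairs:.2e} pairwise comparisons")   # about 2.0e16
```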

Display similar help items

What kinds of actions might be taken?

There are various methods that can be adopted to reduce the nutrient and sediment loads reaching the Bay. These are collectively known as “Best Management Practices” (BMPs). For example, farms can leave buffer areas between planted fields and bordering streams. They can minimize their use of fertilizers, plant cover crops in the winter (to absorb excess nutrients), and provide for the proper removal of animal waste. Municipalities can provide separate storm-water runoff systems and increase the frequency of street cleaning. Even individuals can effect change through household practices (reducing lawn fertilization, planting trees and shrubs) and limiting automobile driving. These types of actions are called non-point sources because they collectively contribute to the nutrient and sediment problem but are not individually measurable. Manufacturing, power generation, and other industrial contributors are point sources, and these are regulated through existing policies and laws.

Display similar help items

How do I keep from getting H1N1 influenza?

There is no guaranteed way to avoid getting influenza. However, the following CDC guidelines for everyday actions can help you stay healthy:

  • Cover your nose and mouth with a tissue when you cough or sneeze, and throw the tissue in the trash after you use it.
  • Wash your hands often with soap and water, especially after you cough or sneeze. Alcohol-based hand cleaners are also effective.
  • Avoid touching your eyes, nose or mouth. Germs spread that way.
  • Stay home if you get sick. CDC recommends that you stay home from work or school and limit contact with others to keep from infecting them.

Display similar help items

How do you make nanotube membranes?

A nanotube membrane typically consists of nanotubes that have all been aligned in one direction, like the bristles of a brush, embedded in another material that is impermeable to water. One recipe for making such membranes is to first grow the carbon nanotubes on a silicon surface so they all stand up on end. This can be done by first putting nanoparticles of a metal like nickel on the silicon, then letting a chemical vapor containing a carbon compound react with the nickel catalyst, resulting in the carbon growing out of the particles as nanotubes.

Once the nanotubes have been grown, a thin film of silicon nitride is deposited around them, so that the nanotubes are embedded in it. Silicon nitride is an insulating material similar to glass. The underlying silicon is then etched away with a chemical that affects neither the silicon nitride nor the nanotubes, leaving a free-standing membrane of nanotubes embedded in silicon nitride. Finally, etching in a vacuum chamber with reactive ions removes the closed ends of the nanotubes, so that nanometer-scale pores open up through the membrane.

Display similar help items

HPF1 vs. HPF2: Solvation - modeling the protein in water at higher resolution

Another major challenge with high-resolution methods is the difficulty of computing accurate potentials for atomic-detail protein modeling in solvent; the electrostatic and solvation terms are among the most difficult to model accurately. Full treatment of the free energy of a protein conformation, with correct treatment of dielectric screening, has no known efficient solution, and the computational cost of a full treatment of the electrostatic free energy (by solving the Poisson-Boltzmann or linearized Poisson-Boltzmann equation for large numbers of conformations) is high. In spite of these difficulties, several studies have shown that refinement of de novo structures with atomic-detail potentials can increase our ability to select and/or generate near-native structures. These methods can correctly select near-native conformations from such ensembles and improve near-native structures, but they still rely heavily on the initial low-resolution search to produce an ensemble containing good starting structures (HPF2-like methods rely on an initial search with HPF1-like methods) (Lee et al. 2001; Misura and Baker 2005; Tsai et al. 2003). Some recent examples of high-resolution predictions are quite encouraging, and an emerging consensus in the field is that higher-resolution de novo structure prediction (structure prediction with atomic-detail representations of side chains) will begin to work if sampling is dramatically increased (thus the grid!). The solvation score is depicted in one of the three score panels in the HPF2 client.
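
For reference (this equation is not spelled out in the help text itself), the linearized Poisson-Boltzmann equation mentioned above is commonly written, in Gaussian units, as

\[ \nabla \cdot \left[ \varepsilon(\mathbf{r}) \, \nabla \phi(\mathbf{r}) \right] - \bar{\kappa}^{2}(\mathbf{r}) \, \phi(\mathbf{r}) = -4 \pi \rho_{f}(\mathbf{r}) \]

where \(\phi\) is the electrostatic potential, \(\varepsilon\) the position-dependent dielectric coefficient, \(\bar{\kappa}\) the modified Debye-Hückel screening parameter, and \(\rho_{f}\) the fixed charge density of the protein. Solving this once for a single conformation is routine; solving it for the enormous ensembles of conformations generated during structure prediction is what makes the full electrostatic treatment so expensive.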

Display similar help items

HPF1 vs. HPF2: Scoring different structures at higher resolutions

Balancing resolution with computational efficiency:
A protein structure prediction procedure must strike a delicate balance between computational efficiency and the level of physical detail used to model protein structure. Low-resolution models can be used to predict protein topology/folds and sometimes suggest function (Bonneau et al. 2001b). Low-resolution models have also been remarkably successful at predicting features of the folding process such as folding rates and phi values (Alm and Baker 1999a; Alm and Baker 1999b). It is clear, however, that modeling proteins (and possibly bound water and other cofactors) at atomic detail, and scoring these higher-resolution models with physically derived, detailed potentials, is a needed development if higher-resolution structure prediction is to be achieved. Recent progress has focused on using low-resolution approaches to find the fold, followed by a refinement step in which atomic detail is added (side chains added to the backbone) and physical scoring functions are used to select and/or generate higher-resolution structures. Several recent studies have illustrated the usefulness of de novo structure prediction methods as part of a two-stage process in which low-resolution methods are used for fragment assembly and the resulting models are refined using a more physical potential and atomic detail (e.g. rotamers) to represent side chains (Bradley et al. 2003; Misura and Baker 2005; Tsai et al. 2003). In the first step, Rosetta is used to search the space of possible backbone conformations with all side chains represented as centroids. This process is well described and has well-characterized error rates and behavior. High-confidence, low-scoring models are then refined using potentials that account for atomic detail such as hydrogen bonding, van der Waals forces and electrostatics.

One major challenge facing methods that attempt to refine de novo models is that the addition of side-chain degrees of freedom, combined with the reduced length scale (reduced radius of convergence) of the potentials employed, requires sampling a much larger space of possible conformations. Thus, one has to correctly determine roughly twice the number of bond angles, to a higher tolerance, if one hopes to succeed.
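
A minimal sketch of the two-stage strategy described above follows. The function is hypothetical; the stage-specific operations are passed in as callables, since the real Rosetta/HPF pipeline is far more involved than this control flow suggests.

```python
# Hypothetical sketch of the two-stage (HPF1-like, then HPF2-like) protocol.
# The callables passed in stand for operations this sketch does not implement.
def two_stage_prediction(sequence, sample_low_res, centroid_score,
                         add_atomic_detail, refine, all_atom_score,
                         n_models=10_000, keep_fraction=0.05):
    # Stage 1 (HPF1-like): broad low-resolution search with side chains
    # represented as centroids.
    decoys = [sample_low_res(sequence) for _ in range(n_models)]

    # Keep only the best-scoring (lowest-energy) low-resolution models.
    decoys.sort(key=centroid_score)
    promising = decoys[: int(n_models * keep_fraction)]

    # Stage 2 (HPF2-like): add side-chain rotamers, then rescore and refine
    # with an atomic-detail physical potential (hydrogen bonding, van der
    # Waals, electrostatics, solvation).
    refined = [refine(add_atomic_detail(m)) for m in promising]
    refined.sort(key=all_atom_score)
    return refined
```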

Display similar help items