Why There Is a Requirement for Laboratory Grade Water

Even for the simplest and most routine laboratory tasks, such as rinsing and washing glassware, the use of laboratory grade water is a must. The purity of water matters especially when it is used in sensitive applications. Pure water eliminates the possibility of contamination and unwanted reactions that can influence the outcome of a laboratory test or experiment and make the results unreliable. Hence, even the tiniest trace of impurity is an important concern in the laboratory.

Water straight out of the tap usually contains microorganisms, endotoxins, salts, and other impurities that can compromise an experiment. One class of contaminant is particulate matter, which can be filtered out by passing the water through a sieve with a pore size smaller than the contaminants. Another class is dissolved non-ionized gases and solids, including man-made organic chemicals, natural organic remains, and oxygen, which result from the water's exposure to environmental contaminants.

The last class of contaminants is dissolved ionized solids and gases, which usually come from the water's exposure to rocks and earth minerals such as limestone (calcium carbonate) and sodium chloride, as well as other soluble chemicals that occur naturally or result from human contamination of the water supply.

There are three grades of water used in laboratory applications: primary grade, general laboratory grade, and ultrapure water. Primary grade water is used for basic lab functions such as washing glassware and feeding autoclaves. General laboratory grade water is used for everything from washing glassware and other laboratory equipment to mixing reagents and performing dilutions. Ultrapure water is the most highly purified grade, standardized to meet the needs of the most sensitive laboratory applications.

Laboratory grade water needs to be free from contaminants. Most contaminants, especially ionized species, contribute to the pH level, alkalinity, conductivity, and hardness of the water. Since pure water is needed throughout laboratory work in both research and clinical applications, a number of technologies have been developed for laboratory water purification systems.

The most common form of water purification is filtration, which has five classifications. Particle filtration involves anything from a coarse sand filter to other filtering media with pore sizes greater than 1 micron. Microfiltration, otherwise known as sub-micron filtration, uses filtering media with pores ranging from 1 down to 0.05 microns, removing some forms of bacteria.

Ultrafiltration involves the use of a membrane filter or molecular sieve, which can remove elements larger than 0.003 microns, such as viruses, pyrogens, endotoxins, DNase, and RNase. Nanofiltration and reverse osmosis are usually used to separate water from specific ions.
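The filtration ladder above can be sketched as a simple classifier that maps a filter's pore size to its class. This is an illustrative sketch, not an official standard: the thresholds follow the ranges given in the text, and the boundary between nanofiltration and reverse osmosis is an assumption for demonstration purposes.

```python
def filtration_class(pore_size_microns: float) -> str:
    """Classify a filter by pore size (in microns), per the ranges above."""
    if pore_size_microns > 1.0:
        return "particle filtration"   # coarse media, e.g. sand filters
    if pore_size_microns >= 0.05:
        return "microfiltration"       # 1 to 0.05 microns; removes some bacteria
    if pore_size_microns >= 0.003:
        return "ultrafiltration"       # membrane filter; removes viruses, pyrogens
    if pore_size_microns >= 0.001:
        return "nanofiltration"        # assumed lower bound, illustrative only
    return "reverse osmosis"           # finest separation, ion-level


print(filtration_class(5.0))   # particle filtration
print(filtration_class(0.2))   # microfiltration
print(filtration_class(0.01))  # ultrafiltration
```

In practice, successive stages are often chained, with coarser filtration protecting the finer (and more expensive) membranes downstream.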

Another technology used to decontaminate water is adsorption by activated carbon, which uses activated carbon filters to capture organic compounds and chlorine. Ultraviolet radiation at certain wavelengths kills microorganisms and reduces the amount of organic compounds present in the water.

Distillation is the oldest water purification technology, involving heating water to its boiling point, then condensing and collecting the water vapor. Lastly, deionization, or the ion exchange method, involves passing water through resin beds that have an affinity for the dissolved, ionized salts in the water.
