When discussing phosphorus from agriculture and water quality, I am often asked, "Is rate the only concern?" If it were that simple, water quality monitoring would likely already show reduced non-point source loads. Most of us have reduced fertilizer inputs as fertilizer costs have increased, and this is documented by reduced phosphorus fertilizer sales.
Rate, or more appropriately soil test level, is important from a water quality standpoint. If the soil test level does not call for a fertilizer application, the risk of loss is reduced to whatever background levels come from the soil. If the soil test is in the agronomic range (15 to 40 parts per million Bray P1), concentrations of total P in runoff are 0.5 ppm or less. If soil test levels are four to five times agronomic levels, runoff concentrations can reach one ppm or more. As a point of reference, P concentrations of 0.01 to 0.04 ppm in lakes will support algae growth.
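The relationship above can be sketched as a simple lookup. This is only an illustration of the thresholds stated in the text, not a calibrated agronomic model; the function name and the interpolation for soil tests between the agronomic range and four to five times that range are my own assumptions.

```python
def runoff_p_category(bray_p1_ppm):
    """Illustrative mapping from Bray P1 soil test (ppm) to the
    runoff total-P concentrations described in the article.

    Assumptions (not from the article): the boundary of 160 ppm is
    taken as roughly four times the top of the 15-40 ppm agronomic
    range; the middle band is a loose interpolation.
    """
    if bray_p1_ppm <= 40:
        # Agronomic range or below: runoff total P about 0.5 ppm or less
        return "runoff total P <= 0.5 ppm"
    elif bray_p1_ppm < 160:
        # Between agronomic and legacy-high levels (assumed band)
        return "runoff total P between 0.5 and 1 ppm (assumed)"
    else:
        # Four to five times agronomic levels: 1 ppm or more
        return "runoff total P >= 1 ppm"

# Reference point from the article: lakes can support algae growth
# at P concentrations of only 0.01 to 0.04 ppm.
print(runoff_p_category(30))   # agronomic soil test
print(runoff_p_category(180))  # legacy high soil test
```

Note how even the "low" agronomic runoff concentration of 0.5 ppm is more than ten times the 0.01 to 0.04 ppm at which lakes support algae growth, which is why rate alone does not settle the water quality question.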
Risks of edge-of-field phosphorus loss are greater when the nutrient has been recently applied. Recently applied P is subject to loss depending on the timing, source, and placement of the application. The key to reducing loss is for applied phosphorus to quickly equilibrate and stabilize in the soil before the next runoff-producing rain event. Surface applications are most at risk. Under tillage systems with reduced mixing, soil test levels at the surface can be higher than the "0-8 soil test sample result" suggests. Surface water, or water flowing preferentially to a tile system, can then carry an elevated P concentration that mimics the water quality results of a high legacy soil test. The same rate of nutrient applied can have different water quality results based simply on soil placement.
Original Story from Ohio Country Journal – www.ocj.com