Drought confuses smart-irrigation controllers
The summer of 2011 was not only hotter and drier than normal; conditions interacted to cause plants to use 30 to 50 percent more water than they would in an average year, Fipps explained. While the amount of solar radiation (the total energy received from the sun) remained close to normal, temperatures and wind speeds were significantly higher.
In the 2010 tests, by contrast, it was excess rain that caused problems with some smart controllers, he said. That round of testing covered eight controllers from six different manufacturers.
“In 2010, it was only the smart controllers that were equipped with tipping-bucket rain gauges that were able to accurately provide the right amounts of irrigation,” Fipps said.
For the tests, Fipps and Swanson programmed each controller for a typical Texas irrigation system and landscape that included ornamental plants, shrubs and turf. They also considered various soil types with different root-zone depths.
“Programming these controllers was no easy task as only two controllers allowed us to input all the landscape parameters that were needed,” Fipps said. “Each manufacturer was allowed to come in and provide assistance in programming to ensure the controller programming most accurately described the landscape, which most manufacturers did.”
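The landscape parameters described above can be sketched as a simple data structure. This is an illustrative assumption of what such inputs might look like, not the interface of any controller in the study; the field names, soil values and depletion math are textbook-style placeholders.

```python
from dataclasses import dataclass

# Typical available water capacity (inches of water per inch of soil).
# Rounded reference values for illustration only.
AWC = {"sand": 0.06, "loam": 0.17, "clay": 0.14}

@dataclass
class Zone:
    plant_type: str       # e.g. "turf", "shrub", "ornamental"
    soil_type: str        # key into AWC
    root_depth_in: float  # root-zone depth, in inches
    mad: float = 0.5      # management allowed depletion fraction (assumed)

    def allowable_depletion_in(self) -> float:
        """Inches of water the zone can lose before irrigation is due."""
        return AWC[self.soil_type] * self.root_depth_in * self.mad

# Zones resembling the Texas test landscape: turf plus deeper-rooted shrubs.
turf = Zone("turf", "loam", 6.0)
shrubs = Zone("shrub", "clay", 18.0)

print(round(turf.allowable_depletion_in(), 2))    # → 0.51
print(round(shrubs.allowable_depletion_in(), 2))  # → 1.26
```

The point of the sketch is why programming was hard: a controller that cannot accept one of these inputs (root depth, say) cannot compute how much stored soil water each zone actually has.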
In the 2010 test, Fipps and Swanson added a "goldilocks" protocol, which classifies performance results according to whether the controllers applied too much, too little or "just the right" amount of water.
“Adequate, inadequate and excessive categories make the testing results easier to understand by consumers and irrigation contractors who are trying to determine which controller to purchase,” he said.
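A "goldilocks"-style scoring rule like the one described can be sketched as a comparison of applied water against the landscape's computed requirement. The ±10 percent tolerance band here is an illustrative assumption, not the study's actual criterion.

```python
def classify(applied_in: float, required_in: float, tol: float = 0.10) -> str:
    """Bin a controller's applied irrigation (inches) against the
    requirement (inches) as inadequate, adequate or excessive.
    The tolerance band `tol` is an assumed example value."""
    ratio = applied_in / required_in
    if ratio < 1.0 - tol:
        return "inadequate"
    if ratio > 1.0 + tol:
        return "excessive"
    return "adequate"

print(classify(3.2, 4.0))  # → inadequate (80% of requirement)
print(classify(4.1, 4.0))  # → adequate  (within tolerance)
print(classify(6.0, 4.0))  # → excessive (150% of requirement)
```

Collapsing raw over- and under-irrigation percentages into three named bins is what makes the results readable for the consumers and contractors the quote mentions.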
The full results are included in the recent report on smart controller testing and performance found on the Irrigation Technology Center website at http://itc.tamu.edu/smart.php.