The short answer is that in irrigation (at least the part that works and isn't wasted) the water soaks into the ground and some of it is taken up by plants, whereas over large open bodies of water there is nothing to slow evaporation down.
Example: roughly two thirds (a little over 66%) of the precipitation that falls east of the Rockies comes from the Gulf of Mexico.
Irrigation water evaporation, on the other hand, has yet to reach even a tenth of one percent of that. It has about as much impact as an extra quart of water going over Niagara Falls.
We can even estimate the temperature of the water generally used for irrigation, how much of it is sprayed, and so forth, and it turns into a simple mathematical exercise.
It is definitely not as simple as that. You aren't taking into account the ambient air temperature, the dew point (which affects how much can be evaporated), how much of the water is absorbed by the soil, winds (which can change the whole equation), the amount of insolation (sunlight), and dozens of other factors it would take too long to list.
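To see why those factors matter, here is a minimal sketch of a classical Dalton-type (mass-transfer) evaporation estimate, where the rate is proportional to the vapor pressure deficit scaled by a wind function. The transfer coefficients (`k`, the 0.54 wind term) are illustrative placeholders, not calibrated values, and real models add the soil, radiation, and other terms discussed above:

```python
import math

def saturation_vapor_pressure_kpa(temp_c: float) -> float:
    """Saturation vapor pressure (kPa) via the Tetens approximation."""
    return 0.6108 * math.exp(17.27 * temp_c / (temp_c + 237.3))

def evaporation_rate(temp_c: float, rel_humidity: float,
                     wind_m_s: float, k: float = 0.26) -> float:
    """Dalton-type open-water evaporation estimate (mm/day).

    E = k * (1 + 0.54*u) * (e_s - e_a)
    k and the wind coefficient are illustrative, not calibrated.
    """
    e_s = saturation_vapor_pressure_kpa(temp_c)   # saturation vapor pressure
    e_a = rel_humidity * e_s                       # actual vapor pressure
    return k * (1.0 + 0.54 * wind_m_s) * (e_s - e_a)

# Hot, dry, breezy conditions evaporate far more than cool, humid, calm ones.
print(evaporation_rate(30, 0.30, 5.0))
print(evaporation_rate(10, 0.90, 1.0))
```

Even this stripped-down version needs temperature, humidity, and wind as inputs; dropping any one of them changes the answer substantially, which is the point being made.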
If weather/climate research were as simple as you make it out to be, we'd already have our forecast for Easter 2020.
Concerning "soaking into the ground": it does that INITIALLY, and then it is sucked up by the plants. Much of it also evaporates directly; that's why irrigation leaves behind salt-damaged land.
Of course weather and climate are not the same thing. Weather is a relatively short-term affair, with changes more or less averaging out over longer periods; the lack of complete averaging over those longer periods is climate change. Weather is also geographically limited, while climate may be as well, but over larger regions. Obviously the two are related, and an understanding of one helps with understanding the other. Weather is probably the more difficult theoretical problem, but climate has more unknowns, some of them quite likely of the "unknown unknown" variety.