They've been able to show some small variations in decay rates, but those variations are many orders of magnitude too small to compress 4.5 billion years' worth of apparent decay into just 6,000 years. And the thermodynamics of cramming that much radioactive decay into a span of a few thousand years would mean the Earth would still be millions of years away from having cooled down enough to even have a crust.
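To put rough numbers on that, here's a quick back-of-envelope sketch (round figures only, assuming the conventional 4.5-billion-year age and the measured decay rates):

```python
# Rough back-of-envelope: how much faster would decay have to run to squeeze
# 4.5 billion years of observed decay into 6,000 years?
# (Illustrative only; the figures are round numbers, not measurements.)

apparent_age_yr = 4.5e9   # conventional age inferred from isotope ratios
claimed_age_yr  = 6.0e3   # young-earth timescale

acceleration = apparent_age_yr / claimed_age_yr
print(f"Required speed-up in decay rate: ~{acceleration:,.0f}x")   # ~750,000x

# The total energy released is fixed by how many atoms decayed, so compressing
# the same decays into 6,000 years multiplies the average radiogenic heating
# power by that same factor.
print(f"Average radiogenic heating power would rise by ~{acceleration:,.0f}x")
```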
Trying to explain it as "leaching" doesn't explain how samples taken from deep inside solid rock formations show the same ratios as samples taken near the surface of the formation.
If the Earth is only 6,000 years old, it should be relatively easy to produce a sample of uranium ore consistent with only 6,000 years of decay. I haven't heard of anyone finding one. Have you?
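For what such a sample would have to look like, here's a minimal sketch using the standard decay law and the U-238 half-life (assuming, purely for illustration, a closed system with no initial radiogenic lead):

```python
import math

# What Pb-206/U-238 ratio would a closed-system sample show after 6,000 years,
# versus after 4.5 billion years, at the measured decay rate?
# Assumptions (illustrative): no initial radiogenic lead, no gain or loss of
# parent or daughter, U-238 half-life ~4.468e9 years.

half_life_yr = 4.468e9
lam = math.log(2) / half_life_yr          # decay constant, per year

def daughter_parent_ratio(t_years: float) -> float:
    """Pb-206 atoms per remaining U-238 atom after t years: e^(lambda*t) - 1."""
    return math.exp(lam * t_years) - 1

print(f"After 6,000 years:        D/P ~ {daughter_parent_ratio(6.0e3):.2e}")   # ~9e-7
print(f"After 4.5 billion years:  D/P ~ {daughter_parent_ratio(4.5e9):.2f}")   # ~1.0
```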
Sure, if you took any one of the assumptions I stated individually, you could say that this specific assumption, by itself, cannot account for the apparent age of the sample,
but all of the assumptions are in effect in every age determination.
And I notice you had no comment on why different radiometric dating methods (uranium-lead, lead-lead, potassium-argon) often produce wildly varying ages for the same sample, so that the reported age ends up being a guess based on the index fossils (assumed to be of a certain age) found near the sample.
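For concreteness, each of those methods turns a parent/daughter ratio into an age with the same basic closed-system formula, which is what makes cross-checking them on one sample possible in the first place. A rough sketch follows; the ratios below are made-up placeholders, and real Pb-Pb and isochron work is more involved than a single-ratio calculation:

```python
import math

# Generic closed-system age from a single parent/daughter pair:
#   t = ln(1 + D/P) / lambda
# The half-lives are standard values, but the D/P ratios passed in below are
# hypothetical placeholders, not measurements.

HALF_LIVES_YR = {
    "U238->Pb206": 4.468e9,
    "K40->Ar40":   1.248e9,   # total K-40 half-life; decay branching ignored here
}

def age_years(daughter_per_parent: float, half_life_yr: float) -> float:
    lam = math.log(2) / half_life_yr
    return math.log(1 + daughter_per_parent) / lam

# Hypothetical ratios measured on the same rock (placeholders, not data):
print(f"U-Pb age: {age_years(0.08, HALF_LIVES_YR['U238->Pb206']):.3e} years")
print(f"K-Ar age: {age_years(0.30, HALF_LIVES_YR['K40->Ar40']):.3e} years")
```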