In addition to the data, there is this more or less philosophical point: radioactives "decay" ultimately to a stable isotope, yes?
Therefore (barring new material from nearby supernovae) the background radiation levels, planetside, are falling. Yes? (Yes, I know there are cosmogenic radioactives being created here all the time; cite me something that suggests they are increasing, and I'll consider them; meantime, all the primordial radioactives that Earth started with are decaying, for a net loss of background radiation.)
Therefore current life is adapted to a slightly higher background than the one it now finds itself in; indeed, it must have originated in a notably higher background field.
This is a question where we must reason from inadequate data or from no data. I am very good at that kind of thing.
So first, you're right. In the absence of any radioactivity coming in from outside, or any new radioactive material produced fresh, the radioactivity must diminish over time. As time goes on, the background is increasingly dominated by the longest-half-life isotope. Everything else dwindles away, leaving only that isotope and its daughters, which tend to have much shorter half-lives.
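Here's a minimal back-of-envelope sketch of that effect, assuming (purely for illustration) equal starting inventories of four real primordial isotopes; the half-lives are real numbers, the equal inventories are not:

```python
# Sketch: how the relative contributions of a few primordial isotopes
# to the background shift over time. Half-lives are real; the equal
# starting inventories are a made-up assumption for illustration.

import math

HALF_LIVES_GYR = {     # half-lives in billions of years (real values)
    "K-40":   1.25,
    "U-235":  0.704,
    "U-238":  4.47,
    "Th-232": 14.0,
}

def activity(n0, half_life, t):
    """Activity = lambda * N(t) for an initial inventory n0 after time t."""
    lam = math.log(2) / half_life          # decay constant per Gyr
    return lam * n0 * math.exp(-lam * t)   # decays per Gyr, arbitrary units

for t in (0.0, 2.0, 4.5, 9.0):             # Gyr after formation
    acts = {iso: activity(1.0, hl, t) for iso, hl in HALF_LIVES_GYR.items()}
    total = sum(acts.values())
    shares = ", ".join(f"{iso} {a / total:.0%}" for iso, a in acts.items())
    print(f"t = {t:4.1f} Gyr:  {shares}")
```

With equal starting amounts, the hot short-lived U-235 dominates the count early on, and the share shifts steadily toward U-238 and Th-232 as everything hotter burns itself out.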
Should we then assume that the background radioactivity has simply diminished gradually over time, and that's all that happened? Well, no.
First, there's some evidence for periodic extinction events. The last time I looked, the data were so thin that there were multiple candidates for the period, so it's a weak case. But one of the wild guesses was that the period matched the time Sol takes to orbit the galaxy once. The hypothesis was that some radiation source does not revolve with the galaxy, and every time we pass close to it we get dosed. That would give us a big dose of radiation once in a long while, and the species that happened by accident to be ready for it would survive better than the others.
There could also be periodic or aperiodic bursts of radiation on a shorter average interval. Like, every so often humanity might develop the technology to make nuclear weapons and then bomb ourselves back to the stone age. The net long-run effect would be to reduce the radioactivity faster, as long-lived isotopes got fissioned away. But in the short run the background count would get kind of spiky.
Reversals of the earth's magnetic field might result in radiation spikes, or might not. The data is weak.
If bursts of ionising radiation happen frequently, say every 10,000 years or so, then it would matter more to have defenses that work well when a burst hits than to have those same defenses give a small survival edge at other times. Individuals with defects in the regulation of those defenses would tend to die in the hot times, and the population would only start drifting toward new regulation in between.
What can we conclude about the level of radioactivity when life was first evolving? Not much. Though it stands to reason it was high, if life evolved fresh on earth soon after the earth cooled enough to have liquid water, and if the radioactivity was spread uniformly through the earth.
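For a rough sense of how much higher, here's the same decay arithmetic run backwards, again assuming decay only with no replenishment; the 4 Gyr figure is just a round number for when early life might have been around:

```python
# Same decay-only assumption, run backwards: N(then)/N(now) = 2^(t / half-life).
HALF_LIVES_GYR = {"K-40": 1.25, "U-235": 0.704, "U-238": 4.47, "Th-232": 14.0}
T_BACK = 4.0  # Gyr before present -- a round-number guess for early life

for iso, hl in HALF_LIVES_GYR.items():
    factor = 2 ** (T_BACK / hl)   # how many times more atoms (and activity) back then
    print(f"{iso:7}: ~{factor:.1f}x today's level")
```

That gives roughly 9x today's K-40 and 50-odd times today's U-235, so the primordial contribution to the background was considerably hotter when life got started.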
Could life today be poorly adapted to a background as low as the one it lives in now? Possibly. It depends.