Based on what you're describing, I would say something changed with the refresh rate of the display -- either a new panel was installed, or something changed on the computer side.
If a new panel was installed, it could be that the old one only had a refresh rate of, say, 60Hz, meaning the screen displays 60 frames of video per second. With the money amounts falling at such a rapid rate, 60fps can only show 60 individual dollar amounts per second, which may come across as a blur on screen. Compounding this issue is the fact that the camera is filming at 59.94 fps, interlaced. I won't go into the technical details of why that's bad, but long story short, you're going to get some blur when filming a screen displaying at 60Hz, because the camera and the display are never in perfect sync with each other -- their timing slowly drifts in and out of phase. A higher refresh rate of 240Hz (for instance) would provide much smoother motion that cameras can capture without such harsh artifacts as blurring.
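To put rough numbers on that sync mismatch, here's a quick back-of-the-envelope sketch (my own arithmetic, not anything from the show) of how often a 60Hz display and a 59.94fps camera drift through a full phase cycle:

```python
# Rough arithmetic sketch: phase drift between a 60 Hz display
# and a camera capturing at the NTSC-style rate of 59.94 fps.

display_hz = 60.0    # panel refresh rate
camera_fps = 59.94   # camera capture rate

# The "beat" frequency is the difference between the two rates;
# it tells you how often their phases fully realign.
beat_hz = abs(display_hz - camera_fps)
beat_period_s = 1 / beat_hz

print(f"Beat frequency: {beat_hz:.2f} Hz")
print(f"One full phase-drift cycle every {beat_period_s:.1f} seconds")
# ~0.06 Hz, i.e. the camera slides through every possible alignment
# with the display roughly once every 16.7 seconds, so some captured
# frames will inevitably straddle two refreshes and look smeared.
```

That slow, constant drift is why you can't just "line up" the camera with a 60Hz panel and call it a day.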
On the other hand, the display may have already had a high refresh rate, but the computer driving it was sending a slower signal than the display was capable of showing, so they made some changes on that side of things. The end result is the same: less blur.
Regardless of the root cause of the change you've noticed -- assuming any change was made at all -- Time is Money is exactly the kind of game where refresh rates probably matter, given dollar amounts falling at such a rapid rate. A few hundred dollars lost in a fraction of a second is something viewers need to see clearly; otherwise, that 'blur' could be perceived as the show taking advantage of technological shortcomings to cheat contestants out of winnings, which could raise S&P issues -- or at least, that's the only reason I can think of. It also makes me wonder whether the show keeps a slow-motion camera or special software on hand to prove the game isn't rigged, should an issue arise.