I personally treat driver updates like firmware updates: something to be dealt with only when there is a specific, documented problem with a known fix. Otherwise, if it ain't broke, don't fix it.
Furthermore, drivers don't always improve with age. Features sometimes get dropped or changed in ways that break things you relied on, and sometimes the drivers just become bloated with excessive micro-management options.
I feel programs that do automatic in-the-background driver updates fall into the same realm as registry cleaners and PC optimizers. Busywork. Makework. Psychological feel-good material. Utilities like these waste resources and cause more problems than they fix. Once you've got a good configuration going, why mess with it? Why destabilize it? And with constant updating, it's easy to lose track of where a problem lies. Is it the system? The hardware? The app itself? Or them spiffy new drivers that were loaded some weeks ago?
One more thing: since I image my system two or three times a year as part of a backup plan, a driver update is reason to trigger an immediate do-it-now backup. That way I can go back to exactly how things were, or run with the new configuration if it's looking good.
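If you want to script that do-it-now image, recent Windows releases ship a built-in tool for it. A minimal sketch from an elevated PowerShell prompt, assuming E: stands in for your actual backup drive:

    # Take a one-off system image of all critical volumes (OS, boot, recovery) to E:
    wbadmin start backup -backupTarget:E: -allCritical -quiet

The -allCritical switch pulls in every volume needed for a bare-metal restore, which is the whole point of imaging before a driver change.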
Having said all that, the last time I updated drivers on one of my systems was almost three years ago, to correct a known, specific, reproducible BSOD condition on a wireless network card.
Some software (like Photoshop, Premiere, After Effects, Flash, and Flash Player) does perform better with the latest graphics drivers. When a new driver becomes available on my graphics card manufacturer's site, I'll update. Needless to say, I create a System Restore Point before that.
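That restore point is scriptable too, if you'd rather not click through System Properties. A minimal sketch using PowerShell's built-in cmdlet (run elevated, with System Restore enabled on the system drive; the description text is just a placeholder):

    # Create a restore point tagged as a driver install
    Checkpoint-Computer -Description "Before graphics driver update" -RestorePointType "DEVICE_DRIVER_INSTALL"

One caveat I believe applies from Windows 8 onward: by default, Windows quietly skips creating a new restore point if one was already made within the previous 24 hours.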
I am much more cautious when Windows Update offers driver updates. The last time I let one install (two weeks ago), it gave me an immediate BSOD. Fortunately, Windows Update also creates an automatic System Restore Point.
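If a bad driver slips through and a restore point isn't available, recent Windows 10/11 builds let you evict the package from the driver store with pnputil. A sketch, where oem42.inf is a placeholder for whatever name the listing actually shows:

    # List third-party driver packages sitting in the driver store
    pnputil /enum-drivers

    # Uninstall the driver from devices using it and delete the package
    pnputil /delete-driver oem42.inf /uninstall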
I've always wondered why they can't make working drivers for video cards from the get-go. I used to update them 4 or 5 times a year too, and that really turned me off of cutting-edge PC gaming.
The only exception I can think of is the son's video card, whenever he gets a new game that requires the latest Nvidia update.
I'll never buy another Nvidia display card, not after messing with their buggy drivers that have caused visual glitches for the past few years. Strangely, with the drivers removed the glitches are 100% gone, but Windows is so slow at drawing without them. I had switched over from ATI thinking their buggy drivers were dreadful, but the other side was even worse than I could've ever expected.
Cyan, the company behind the Myst games, had to coerce Nvidia to update their drivers so that their cards would support Cyan's later Myst games. I was surprised at how fast Nvidia responded to them.
In talking just now with an industry insider, I was told that while there may be new features in the control panel or UI changes, the upgrade "need" comes from a game either crashing or producing a distorted image. Bugs crop up when documented hardware features are actually used by some game, or are used incorrectly by either the driver or the game.
Additionally, you generally don't encounter this update rat-race with professional cards and professional applications (modeling and GPGPU computing). Drivers for these are put through some serious testing. For real. But with gamer-grade drivers and hardware, you do the beta testing. And you are happy to do it, or so the company thinks. Gamers = tweakers = early adopters = willing beta testers = flunkies = gullible users. And gamers care about speed and benchmarks, not stability and reliability, or so the company believes.
And that is why a pro-level card is so much more costly. You are paying for verification. And it makes sense.
"...strangely with the drivers removed the glitches are 100% gone but Windows is so slow at drawing without them."
Yep. With no drivers loaded, your fancy-schmancy videocard is nothing but a framebuffer for the CPU. Windows and the CPU are doing it old-school, like in the days of the 8086 and 80286. Windows 3.1 was the early era of videocard drivers. Everything went to hell in a handbasket once 3D games came on the scene. 3dfx and Rendition and S3 weren't all that bad, especially since they were among the first. But Nvidia is just terrible, then and now.
Speaking of updating drivers, just hours ago I updated my Kodak AiO printer/scanner drivers. Knock on wood, all is going very well, and it actually fixed a bit of a nuisance the old version had.