How many people have opened their desktop PC and replaced their CPU (with a faster one)?
I did, once, on an old laptop several years ago to extend its useful life a little, but not on any desktop system.
How many people have opened their desktop PC and replaced their CPU (with a faster one)?
I did, once, on an old laptop several years ago to extend its useful life a little, but not on any desktop system.
Let's see here... all in all, here's my thinking:
1) If a processor "dies", you still have another one to use.
2) AMD/Intel... because sometimes you run across the occasional game or app that throws fits on an Intel system but not on an AMD one (or vice versa).
3) Because they operate in similar yet different ways, the user could take advantage of processor-specific apps that excel under one processor or the other.
1- This is done in aerospace electronics quite a bit, though they run the second and third processors concurrently rather than bringing a backup online after a failure. It's like having your spare CPU mirror the primary CPU: if and when one dies, the system continues as if nothing happened. But I don't think consumers will pay the extra cost for such a system, even if it adds only $150 to the cost of the motherboard.
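For the curious, the concurrent-redundancy scheme described above usually pairs each set of processors with a majority voter. Here's a toy sketch of the idea (nothing like real avionics code, and `majority_vote` is just a name I made up):

```python
def majority_vote(a, b, c):
    """Return the value that at least two of three redundant units
    agree on. A single failed unit is simply outvoted; if all three
    disagree, a real system would flag a fault condition."""
    if a == b or a == c:
        return a
    if b == c:
        return b
    raise RuntimeError("no two units agree: fault condition")

# One unit has failed and returns garbage; the other two outvote it.
result = majority_vote(42, 42, 7)  # result is 42
```

This is why the system "continues as if nothing happened": the faulty output never reaches the rest of the system, and no switchover delay is needed.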
2- If something isn't working, the solution is a software patch. Rarely is code released to consumers that is processor-specific; it would reflect badly on the developers. Modern compilers work around any micro-architecture differences. It would reflect even worse on the CPU maker, that their chip can't run such-and-such software. The manufacturers go to extraordinary lengths to ensure compatibility. You're more likely to have this problem with graphics cards, and even then, with standardized APIs, the hardware layer is abstracted out of the discussion.
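To illustrate how that abstraction typically works: libraries detect the hardware at runtime and dispatch to a matching code path, so the same binary behaves on any CPU. A minimal sketch of the pattern (the function name and path labels are invented for illustration; real libraries inspect CPUID feature flags, not just the architecture name):

```python
import platform

def pick_implementation():
    """Choose a code path based on the CPU architecture detected
    at runtime, falling back to a portable version otherwise."""
    arch = platform.machine().lower()
    if arch in ("x86_64", "amd64"):
        return "x86-vectorized"
    if arch == "aarch64" or arch.startswith("arm"):
        return "neon-vectorized"
    return "portable-fallback"

print(pick_implementation())
```

Because the selection happens inside the library, neither the application developer nor the user ever sees the processor-specific part.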
I remember it *used* to be that way in the beginnings of the 3D gaming market, back in the '90s and early 2000s. Certain things were built for MMX, SSE, and AMD's 3DNow!, not forgetting OpenGL, Glide, or DirectX. Same thing with the 486 to Pentium to Pentium II transitions: it was about the instruction sets, and about features of the motherboard architecture, like AGP and which graphics cards were available on those mobos.
Today, it's all about the OS and its companion APIs and standards. Hardware is really abstracted out of the picture. IMHO.
How many people have opened their desktop PC and replaced their CPU (with a faster one)?
On my last remaining homebuilt, I would say 5 times. But that's out of the ordinary, not mainstream, an aberration. Each successive upgrade gave less of a performance boost than the one before, and each upgrade highlighted the shortcomings and bottlenecks of the rest of the system. On all other desktops, mine and friends' alike, as soon as the system couldn't handle the software they wanted, it always resulted in a whole new rig. There was no fuss or fretting over researching replacement CPUs or other parts.
IMHO, the only worthwhile upgrades are exactly not for that. Performance! Err... let me state it this way: it is not cost-effective to do speed enhancements, because speed requires too much infrastructure to be ripped out. But things like storage and functionality? Yep, those are always cost-effective (and very satisfying) upgrade paths: HDD (or SSD), printers, monitors, stuff like that. You always get good bang for the buck. Not so much with the processor and RAM, for those mean a new motherboard and power supply... and the whole thing snowballs into a new system at a much higher cost than chucking the old stuff in the first place.
On the other hand: buying a new computer is good for the economy. It pushes GDP higher.
And creates a lot of e-waste. So that's the other side of the coin.
I've had to, and not knowing how meant I ended up buying a new computer instead.
Whoops..
How many people have opened their desktop PC and replaced their CPU (with a faster one)?
Many times.
AMD: We're not abandoning socketed CPUs
Is it just a rumour, then...
http://www.maximumpc.com/article/news/intel_says_company_committed_sockets2012
I used to buy desktops because they could be upgraded, but I seldom did. So since 2000 I haven't used a desktop PC at all. Too bulky and too heavy. Long live the laptop/notebook.
I used to buy desktops because they could be upgraded, but I seldom did. So since 2000 I haven't used a desktop PC at all. Too bulky and too heavy. Long live the laptop/notebook.
I've never used a laptop as my main PC and probably never will (at least not for a long, long time). Gaming sucks on laptops, and they're somehow uncomfortable to use. They can't be upgraded and lack the performance of a desktop PC (unless you spend ~2000€ on the laptop).
I also prefer desktops.
I used to prefer desktops because of a "big" CPU & GPU. Today, I like laptops, and their graphics are good enough for all but the most sophisticated of games. Integrated on-chip graphics all the way!
TLDR edition:
Having to constantly change out a graphics board, and sometimes a CPU, just to enjoy the latest and greatest game from 'Big Box' retail shelves got to be too much for me. I couldn't take it anymore. The $1,200 GPU+CPU combo was getting ridiculous, not to mention fighting with drivers that would work with one thing but not another.
If there's one thing, it's that the stability of a laptop with integrated graphics is superior to any homebuild. Integrated graphics (especially on-die varieties) are held to a higher standard: it would do no good for Intel's or AMD's latest CPU to crash while playing a simple dumb-ass computer game. Whereas with a card, you have to try updating the drivers, different slots, different varieties. Tech support can bounce you back and forth between the card maker, the graphics chip maker, the motherboard manufacturer, or the software publisher! With a laptop, this is not so: when any one part in a laptop experiences an incompatibility, the whole computer "takes the blame"! And it is this that is a powerful force in ensuring compatibility. There's a lot invested in making a reliable rig. Not so with a desktop, and especially not with homebuilt machines.
All those problems are engineered out and taken care of when you go with integrated graphics. And driver authors are more likely to make their product work rather than eke out 1-2 FPS for the sake of MPC benchmarks.
And now, with buggy games (and patches) being released more often, it's best to wait a year or two for the trouble-free versions anyway.
And their graphics are good enough for all but the most sophisticated of games.
Not really.
TLDR edition:
Having to constantly change out a graphics board, and sometimes a CPU, just to enjoy the latest and greatest game from 'Big Box' retail shelves got to be too much for me. I couldn't take it anymore. The $1,200 GPU+CPU combo was getting ridiculous, not to mention fighting with drivers that would work with one thing but not another.
Haven't been fighting with drivers in years; usually just with some weird/rare/old hardware (not GPUs), and a CPU doesn't need drivers (well, the first dual-cores had some optimization drivers, but...). You can get a great CPU+GPU combo for $500-$700; you don't need to pay $1,200. Also, upgrading from a single core to a dual core gives a great performance boost. So does going from 2 cores to 4, but it's not as noticeable in daily use.
If there's one thing, it's that the stability of a laptop with integrated graphics is superior to any homebuild. Integrated graphics (especially on-die varieties) are held to a higher standard: it would do no good for Intel's or AMD's latest CPU to crash while playing a simple dumb-ass computer game. Whereas with a card, you have to try updating the drivers, different slots, different varieties. Tech support can bounce you back and forth between the card maker, the graphics chip maker, the motherboard manufacturer, or the software publisher! With a laptop, this is not so: when any one part in a laptop experiences an incompatibility, the whole computer "takes the blame"! And it is this that is a powerful force in ensuring compatibility. There's a lot invested in making a reliable rig. Not so with a desktop, and especially not with homebuilt machines.
All those problems are engineered out and taken care of when you go with integrated graphics. And driver authors are more likely to make their product work rather than eke out 1-2 FPS for the sake of MPC benchmarks.
And now, with buggy games (and patches) being released more often, it's best to wait a year or two for the trouble-free versions anyway.
Laptop drivers suck. Manufacturers stop developing them very early, and you can't use the normal drivers, e.g. GPU drivers from AMD's/NVIDIA's websites. You also can't really overclock or adjust/tweak as much.
If one part of a laptop breaks, you have to send the whole laptop in for service. With desktop PCs (especially custom-built ones), you can quickly swap in a spare part if you have one, or go and buy a new component. Easily customized, fixed (IMO), upgraded, etc.
One thing I hate about laptops/desktops is that BIOSes suck because all options are hidden. I have always found this behaviour to be quite stupid btw.
Go with UEFI, it's glorious.
I worked on a friend's computer that had it, and decided right there: never again with BIOS.
One thing I hate about laptops/desktops is that BIOSes suck because all options are hidden. I have always found this behaviour to be quite stupid btw.
This is usually the case with laptops, low-price (or old) motherboards, and off-the-shelf "market" PCs (not custom-built PCs).
Go with UEFI, it's glorious.
I have UEFI. But I still had many options in my previous motherboard's BIOS.
I replaced the CPU with a faster model one time, about 18 years ago (Pentium times) :-)
The other times, I changed the board completely.
Counterpoint!
Today's cpu integrated graphics runs all kinds of games, except for things like Crysis 4. I can certainly play X-Plane flight simulator, Orbiter spaceflight simulation, Doom, Need for Speed, World of Warcraft, or all my classic arcade games, and more. That's good enough for like 95% of computer users today.
Advanced gamers will upgrade and replace hardware on 6-month cycles, oftentimes buying a new CPU+GPU and an entire entourage of parts that adds up to more than $1,200.
Laptop graphics drivers aren't updated as much, or for as long, because there isn't a need for those shenanigans when the job is done right in the first place. By shenanigans I mean that too many companies, hardware and software vendors alike, release things before the bugs are worked out. Pushing buggy software into the hands of users is too common today; hence all the updates.
Tweaking? If you're a hobbyist that's all well and fine. Nothing but the biggest baddest heat-spewing cooling system will do. The loudest fans are a must.
One thing I hate about laptops/desktops is that BIOSes suck because all options are hidden. I have always found this behaviour to be quite stupid btw.
Well, you certainly don't need to be changing BIOS options on a system that doesn't have changeable hardware. A laptop is like a smartphone in this case; you probably don't access your smartphone's BIOS screen much either.
Well, you certainly don't need to be changing BIOS options on a system that doesn't have changeable hardware. A laptop is like a smartphone in this case; you probably don't access your smartphone's BIOS screen much either.
Nope. Some BIOSes have defaults (shadow RAM and/or cached RAM) that can conflict with newer operating systems; I have seen some of these. And some people would like to tinker with other options too. Some BIOSes also hardcode the SATA operation mode, for instance.