Recently, I was planning on purchasing a new "micro-PC" to use in our entertainment center to supplement our FireTVs and other media streaming devices. While it is quite amazing how much processing power can now be packed into tiny devices such as cell phones, in this case the best option for us was a surplus "ultra slim" desktop approximately the same size as a DVD player. Despite being six years old, it is actually over eight times faster than the "micro-PC" I was planning on purchasing, far more versatile, and much more expandable, all for the same price, around $150. Of course, its processor uses 65 watts, which is ten times more than the processor in the "micro-PC".
In consumer products, much of the progress made in the past few years has been oriented more toward power savings than toward making more capable devices. All part of the greenie revolution, I suppose. But how does saving $20 a year in electricity really compare to using a hobbled device?
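For what it's worth, a back-of-the-envelope check of that "$20 a year" figure (the usage hours and the $0.13/kWh rate below are my own assumptions, not from the post; the 6.5 W micro-PC wattage is inferred from the "ten times more" comparison):

```python
# Rough sanity check of the ~$20/year electricity-savings claim.
# Assumed figures: 65 W desktop CPU vs. ~6.5 W micro-PC CPU,
# 8 hours of use per day, $0.13 per kWh (typical US residential rate).
watts_saved = 65 - 6.5                 # power difference, watts
hours_per_year = 8 * 365               # 8 hours/day of use
kwh_saved = watts_saved * hours_per_year / 1000
annual_savings = kwh_saved * 0.13      # dollars per year
print(f"${annual_savings:.2f}")
```

Under those assumptions it comes out a little over $22 a year, so the $20 figure is in the right ballpark for part-time use; a box running 24/7 would save roughly three times that.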
For most of the tasks we perform, the added capability of more processing power probably does not matter much. But it was interesting that the old computer I purchased for the home entertainment center was 1/5 the cost of my recently purchased laptop but 20% faster. Moore's Law seemed to have gone out the window. But the two devices were both purpose-built, and the six-year-old computer was purchased for approximately a tenth of its original price. So market values can definitely skew perception. I just thought this might be an interesting topic this morning.
99.99% of people would be just as happy with a 386...
Microsoft will make sure the next OS forced on the masses will cripple the capability of this high performance processor and any like it.
Or use it as a major server animal, maybe in a mission-critical setup paired with another unit so there's no single point of failure. Doesn't look like something for the average Joe, or am I wrong?
Reductions in CPU power consumption are being made to increase the uptime of devices that run on batteries. That is why your smartphone battery lasts more than 15 minutes.
“... its processor uses 65 watts, which is ten times more than the processor in the ‘micro-PC’ ...”
About six years ago I bought a refurbished Dell 4-core Xeon server. I regretted it. Although it was more than fast enough for anything I wanted to do, it has a 950 W power supply and lots of fans. Even the memory modules had fins and fans. Bottom line, I bought a less powerful machine to use for most of my stuff (~250 W), because the faster-than-stink server was like running a noisy space heater next to my desk.
I really don't understand why people wait in line for the newest iPhone. Other than the three new lenses, how can each year's release possibly make your life better?
Plus, I hate social media. It's destroying human interaction. Yes, I was on Facecrap, but cancelled it after too many people sent me stupid, uninteresting, mundane updates of their daily activities. No, I don't give a shite about what you had for breakfast, or how many minutes you ran today, or if you shampooed your pet, or how many supposed "friends" you have. Guess what, dumbshites: just because they clicked a button doesn't mean they are a real friend. Pfft!
>>In consumer products, much of the progress made in the past few years has been oriented more toward power savings than toward making more capable devices. All part of the greenie revolution, I suppose. But how does saving $20 a year in electricity really compare to using a hobbled device?
Most of the devices you’re talking about are based on cell phone processors, and the drive toward power savings is a byproduct of that. They save power because they’re intended for battery-powered devices, and are being re-purposed for other uses.
The drive to cut wattage is mostly about heat. You can’t put a fan on something and have it be tiny, so you have to reduce wattage to reduce heat. Even with all the effort, my FireTV Stick and Roku Stick both get pretty darn hot using passive cooling; you’d be able to fry an egg on them if they didn’t sacrifice some performance for heat considerations. Plus, cooler electronics live longer than electronics in hot environments. So, like in most things, there are tradeoffs.
I’ve been, among other things, a PC system builder since the early 1990s. One thing I have noticed is Intel processors “always” work, while in a batch of 100 AMD processors there will be a couple of duds. Last month, against my better judgement, I built a system for a friend using the latest Ryzen processor. Sure enough, the processor was bad. This past week Rockstar Games released Red Dead Redemption 2 for the PC, and there appear to be significant problems running it with AMD processors.
You’ve been warned: Intel processors are more reliable. You get what you pay for...
I was working just down the street at Signetics when AMD started up. We used to have drinks at the Wagon Wheel with Larry Stenger before they started AMD. (I must be getting really old. Ahem.)
Hi.
How do you cool the thing? Nitrogen?
5.56mm