I put off upgrading a 2016 13" MacBook Pro, thinking that the second iteration of the M1 would be worth the wait. I should have just gone ahead and upgraded earlier. The best thing about the upgrade is that it hasn't been exciting at all; instead, all of the minor annoyances of my older laptop went away. It's the same size and weight as my old 13" MacBook Pro, but there is no beachball, no fan noise, and the battery lasts forever. I also like that it essentially looks the same as recent Intel MacBooks, so it isn't a special target for theft and doesn't look like I spent a ton of money on it. I think the single-core Geekbench numbers are often the most relevant to how fast these machines "feel". It depends on how much the screen size matters to you, though an iPad using Sidecar might be worth looking at for additional screen real estate.

I can understand who likes macOS and why they'd like it, but if you do anything other than text editing or media encoding for a living, the benefits of using macOS start to deteriorate very quickly. It certainly doesn't look appealing to the gaming crowd or to people writing software for Windows (with the M1, cross-compiling is dead in the water), it can't do Boot Camp anymore, and it's a nightmare for DevOps folks who want to run Docker tests on their laptop. It's overall less compatible with software than Windows or even Linux, and if you want to do development outside of the Apple ecosystem or beyond the web, getting a proper development environment up and running is an exercise on the treadmill of never-ending maintenance. But there are a number of consumer markets who would indeed drag macOS for lacking 32-bit support, decent container compatibility, or a package manager.

Would it not depend on the workload? Fair enough, I hear a lot of web developers and content creators who share this sentiment.

> I don't think any reasonable evaluation could call macOS 'crap overall'.

That's part 2 of the article you linked.

> Early Intel CPU requires 2W+ for AV1 4K60 content hardware decode.

And those CPUs have only partial hardware decode for AV1. This is an improvement compared to the M1, which has no hardware decode at all for AV1 and uses even more than 2W to play back AV1 at 4K.

> And what has sharing iGPU design got to do with Video Decoder IP? (If you are referring to Samsung and AMD)

Because the video decoder is part of the iGPU. The point is that a video decoder that uses 500mW on a laptop is still more than 150mW on a smartphone.

Is your theory that AMD made a video decoder for smartphones that is 4x more efficient and just decided not to include it in their other GPUs?

Powermetrics is unclear and not nearly granular enough compared to the tools used in the PC world to be able to trust it. For example, on any modern PC laptop, you will get the exact power rail where the voltage and current are measured, and not only that, you will get the point at which this power is measured (before or after which VRM, exactly). Not only that, but you will also get the actual power measured by the battery BMS at any given instant. Even this is still taken with a huge asterisk, and people still measure actual power draws, because even this level of granularity isn't enough. If you want to make these comparisons, you have to compare wall numbers; vague, filtered numbers from powermetrics just don't cut it. So clearly, and unequivocally, powermetrics is unreliable.
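For reference, this is roughly how those powermetrics package-power numbers get pulled in the first place. A minimal Python sketch, assuming the `cpu_power` sampler and a "Combined Power"/"Package Power" line in the output (both of which vary by macOS release, so the flags and regex may need adjusting), so you can line the reported figure up against a wall meter or battery readout yourself:

```python
import re
import subprocess

def sample_package_power(duration_ms: int = 1000) -> float:
    """Take one powermetrics sample and return the reported package power in watts.

    Requires root (powermetrics runs via sudo) and an Apple Silicon Mac; the
    output format differs between macOS releases, so the regex is best-effort.
    """
    out = subprocess.run(
        ["sudo", "powermetrics", "--samplers", "cpu_power",
         "-i", str(duration_ms), "-n", "1"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Look for a line such as "Combined Power (CPU + GPU + ANE): 213 mW"
    # or "Package Power: 213 mW", depending on the macOS version.
    match = re.search(r"(?:Combined|Package) Power.*?:\s*([\d.]+)\s*mW", out)
    if not match:
        raise RuntimeError("No package power line found in powermetrics output")
    return float(match.group(1)) / 1000.0  # mW -> W

if __name__ == "__main__":
    watts = sample_package_power()
    print(f"powermetrics-reported package power: {watts:.2f} W")
    print("Compare against a wall meter; the two can differ by several watts.")
```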
> How do you know that powermetrics accounts HW decode properly to GPU or CPU and not the SoC/Other section?

It clearly isn't being accounted to the GPU: powermetrics reports the GPU power consumption during 4K60 VP9 decode to be 2mW. And it certainly isn't being accounted to the P-cores or E-cores, so all you're seeing is DRAM usage. The "Other" category in the legend is supposedly package power, but it's less than the sum of the rest, so it's either inaccurate or the legend is wrong.

The bigger issue is that, on M1 Macs, powermetrics is just unreliable. If you look at AnandTech's testing, you'll see that powermetrics sometimes overshoots wall power and sometimes undershoots it drastically. Not only that, but powermetrics reports an idle of 200mW package power, while the actual laptop is drawing 7.6W at idle from the wall. AnandTech also did a wall power measurement of the M1 Mac Mini and found a 4.2W idle. This is pretty ridiculous: there is nothing that's going to be drawing power outside of the SoC and the screen, and the screen is not drawing 7.4W.
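To make the idle discrepancy concrete, here is the arithmetic behind that 7.4W figure as a small sketch; the numbers are just the ones quoted above, not new measurements:

```python
# Figures quoted in the discussion above (not new measurements).
wall_idle_w = 7.6             # laptop idle draw measured at the wall
powermetrics_package_w = 0.2  # package power reported by powermetrics (200 mW)
mac_mini_wall_idle_w = 4.2    # AnandTech's wall measurement for an idle M1 Mac Mini

# Power the package number leaves unexplained: 7.6 W - 0.2 W = 7.4 W,
# which would have to come from the display, PSU losses, and everything
# else powermetrics does not report.
unaccounted_w = wall_idle_w - powermetrics_package_w
print(f"Unaccounted power at idle: {unaccounted_w:.1f} W")

# Even the Mac Mini, with no built-in display, idles far above the
# reported package power.
print(f"Mac Mini wall idle vs reported package: "
      f"{mac_mini_wall_idle_w:.1f} W vs {powermetrics_package_w:.1f} W")
```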