Well, not exactly. You have to remove instructions at some point. That’s what Intel’s x86-S is supposed to be. You lose some backwards compatibility, but the removed instructions are chosen to have the least impact on most users.
I also haven’t wanted an Intel processor in a while. They used to be best in class for laptops prior to the M1, but now they’re basically last behind Apple, AMD, and Qualcomm. They might win a few specific benchmarks that matter very little to most people, and they’re still the default option in most gaming laptops. For desktop use the Ryzen family is much more compelling. For servers they still seem to have an advantage, but that’s also an industry built on long-term contracts, which Intel has more infrastructure for than its competitors, and ARM is gaining ground there too with exceptional performance per watt.
Exactly. Adding a third should be much simpler than a second.
I know, it just would have been wrong to say no battery at all
As a fellow RISC-V supporter, I think the rise of ARM is going to help RISC-V software support and eventually adoption. They’re not compatible, but right now developers everywhere are working to ensure their applications are portable and not tied to x86. I also imagine that when it comes to emulation, ARM is going to be a lot easier to emulate than x86, possibly even statically recompilable.
I’m both surprised and not surprised that ever since the M1, Intel seems to just be doing nothing in the consumer space. Certainly losing their contract with Apple was a blow to their sales, and with AMD doing pretty well these days, ARM slowly taking over the server space where backwards compatibility isn’t as significant, and now Qualcomm coming to eat the windows market, Intel just seems like a dying beast. Unless they do something magical, who will want an Intel processor in 5 years?
All else being equal, a complex decoding pipeline does reduce the efficiency of a processor. It’s likely not the most important factor, but once larger efficiency problems are addressed, there will come a point where it does become an issue.
We stuck to x86 forever because of backwards compatibility and because nobody had anything better. Now manufacturers do have something better, and it’s fast enough that emulation is good enough for backwards compatibility.
With almost no battery even
Many of the people complaining about a feature they would just disable and never use are also the same kinds of people who would complain about basic accessibility features and call them “unnecessary bloat”.
How do any of those things have anything to do with LLMs? You’re just listing a bunch of random tech that isn’t particularly impactful and claiming that another unrelated thing must be a failure.
I’m with you on this one. I love Lemmy, but it’s a small community here and it skews towards a very specific FOSS tech nerd demographic that doesn’t represent the general population in any way. Most users seem aware of that, but not everybody is self-aware enough to realize it. I like trying out AI features, and I like seeing them integrated into software so they can be more useful. They’re far from perfect, but that doesn’t mean they should be abandoned in their entirety.
I have a Mac set up at work for CI testing with no Apple ID or payment associated with it. Can’t use the App Store but I don’t need to for the C/C++ build tools and anything from homebrew. Updates install without issue.
I have an M2 MBA and it’s the best laptop I’ve ever owned, second only to the M3 Max MBP I get to use for work. Silent, battery lasts all week, the interface is fast, and it runs all my dev tools like a charm. Zero issues with the device.
Someone who is buying a MacBook with the minimum specs probably isn’t the same person that’s going to run out and buy another one to get one specific feature in Xcode. Not trying to defend Apple here, but if you were a developer who would care about this, you probably would have paid for the upgrade when you bought it in the first place (or couldn’t afford it then or now).
It’s a game of whack a mole. In the past I’ve been able to get it to work in India, but now YT India blocks foreign payment cards. Was able to set up a monthly subscription in Ukraine recently using my foreign credit card. The taxes support the war effort I guess.
With free VPNs, probably, but otherwise it’s not too big of a deal. Once in a while a specific site will be broken; archive.is recently would force you into an infinite captcha (which was really annoying because I couldn’t read many archive links posted here). Some big sites that are targets for various attacks will use a captcha in the login process, but once you complete it, it goes away.
Nowadays Windows will update UEFI and firmware for many devices through Windows Update. Most users have no idea what a UEFI is or how to manually check and update device firmware, so this is a big win for security. Linux users can do the same with fwupd, which comes installed on many popular distros and is integrated into the software manager apps from GNOME and KDE, making the experience largely the same.
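For anyone who prefers the terminal, the fwupd flow is roughly this (assuming fwupd is installed; which devices actually show updates depends on whether the vendor publishes firmware to the LVFS):

```shell
# Pull the latest firmware metadata from the LVFS
fwupdmgr refresh

# List devices with pending firmware updates, if any
fwupdmgr get-updates

# Download and apply the available updates
# (some devices only flash on the next reboot)
fwupdmgr update
```

The GUI software centers call into the same fwupd daemon, so the command line is just the same mechanism without the frontend.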
What’s the issue with Dell? Everyone I know at work with Dell laptops likes them. I’ve used XPS 15 and 13 in the past and they’ve been generally fine. Battery life sucked but I haven’t ever seen an x86 laptop with what I would consider good battery life.
Instruction decoding takes space and power. If there are fewer, smaller transistors dedicated to the task it will take less space and power.