To any Linux users: I recently bought a fully loaded M4 MacBook Pro to replace my aging Lenovo and strongly regret it. I thought I would use it for playing with LLMs, but local dev on a Mac is not fun and I still don't have it fully set up. I'll probably replace it with a Framework at some point in the near future.
Edit: okay, that garnered more attention than I expected, I guess I owe a qualification.
1. Everything is just slightly different. I had to split all my dotfiles into common/Linux/Mac-specific sections. Don't expect to be able to clone and build any random C++ project unless someone in the project is specifically targeting Mac.
2. Not everything is supported natively on arm64. I had an idea and wanted to spin up a project using DynamoRIO, but it wasn't supported. Others have mentioned the Docker quirks.
3. The window manager. I'm not a fan of all the animations and needing to gesture between screens (and yes, I've been down the hotkeys rabbit hole). To install a 3rd party window manager, you need to disable some security setting because apparently they work by injecting into the display manager and calling private APIs.
So my personal takeaway was that I took the openness of the Linux ecosystem for granted (I've always had a local checkout of the kernel so I can grep an error message if needed). Losing that, for me, felt like wearing a straitjacket. Ironically I have a MBP at work, but spend my day ssh'd into a Linux box. It's a great machine for running a web browser and terminal emulator.
I've used Macs for 20 years starting on the day 32-bit Intel Macs were released, and agree with the GP. Linux and Plasma spoiled me, going back to macOS and its windowing system feels like a step backward, especially for development, where using multiple windows is a must. Task switching is.. not good? I don't get window previews I can switch through when I hover over the dock, but I do on Linux.
Yes, I know about Yabai and the other things that modify the existing window manager. The problem is the window manager itself.
Outside of the windowing system, running native Linux if you're deploying to Linux beats using an amalgamation of old BSD utils + stuff from Homebrew and hoping it works between platforms, or using VMs. The dev tools that are native to Linux are also nice.
When it comes to multiple monitors, I want a dock on each monitor. I can do that in Plasma, but I can't in macOS, unless I use some weird 3rd party software apparently.
> Task switching is.. not good? I don't get window previews I can switch through when I hover over the dock, but I do on Linux.
That just sounds like being accustomed to one way of switching tasks, honestly. If I want previews, I use Expose (three-finger swipe up/down or ctrl-up/down). But mostly I just use cmd-tab and haven't really needed to see previews there. Because macOS switches between applications, not windows, often there isn't a single window to preview, and I'm not sure showing all the windows would work well either. Expose works well because it can use the entire screen to show previews.
Agreed; as a software engineer of ~8 years now, Mac is actually my _preferred_ environment -- I find it an extremely productive OS for development, whether I'm working on full stack or Unity game dev in my free time.
I don't agree with OP's sentiment that macOS is a bad dev environment, but surely I prefer Linux+KDE as an overall dev environment. I find that all the tools I need are there but that I'm fighting the UI/UX enough to make it a hassle relative to KDE.
> I don't agree with OP's sentiment that macOS is a bad dev environment, but surely I prefer Linux+KDE as an overall dev environment. I find that all the tools I need are there but that I'm fighting the UI/UX enough to make it a hassle relative to KDE.
This sounds like you think macOS is a good dev environment, but that you personally don't like the UI/UX (always safer to make UI/UX judgements subjective ["I don't like"] rather than objective ["it's bad"], since it's so difficult to evaluate objectively, e.g., compared to saying something like Docker doesn't run natively on macOS, which is just an objective fact).
For example, if I open a new Firefox window, the Mac seems to force the two Firefox windows onto different desktops. This already is a struggle, because sometimes I don't want the windows to be on two desktops. I find that if I try to move one window to the same desktop as the other, then Mac will move the other desktop to the original desktop so they are both still on different desktops.
OK, got sidetracked there on a different annoyance, but on top of the above, CMD-backtick doesn't usually work for me, and I attribute it to the windows typically being forced onto different desktops. Some of the constraints for using a Mac are truly a mystery to me, although I'm determined to master it eventually. It shouldn't be this difficult though. For sure, Mac is nowhere near as intuitive as it's made out to be.
My favorite is how it'll force move your workspace if you get a popup.
To reproduce, get a second monitor, throw your web browser onto that second monitor (not in full screen), and then open an application in full screen on your laptop's screen (I frequently have a terminal there). Then go to a site that gives you a popup for OAuth or a Security Key (e.g. GitHub, Amazon, Claude, you've got a million options here). Watch as you get a jarring motion on the screen you aren't looking at, have to finish your login, and then move back to where you were.
> Mac are truly a mystery to me
Everyone tells me how pretty and intuitive they are yet despite being on one for years I have not become used to them. It is amazing how many dumb and simple little problems there are that arise out of normal behavior like connecting a monitor. Like what brilliant engineer decided that it was a good idea to not allow certain resolutions despite the monitor... being that resolution? Or all the flipping back and forth. It's like they looked at the KDE workspaces and were like "Let's do that, but make it jarring and not actually have programs stay in their windows". I thought Apple cared about design and aesthetics but even as a Linux user I find these quite ugly and unintuitive.
Or just put a program onto a second monitor then open a second window for that program. Usually it will not open in the same monitor. This is especially fun when you get pop-ups in browsers...
> Or otherwise you can enable the app exposé feature to swipe down with three fingers and it will show you only windows of the same app.
If you have an Apple keyboard, CTRL-F3 (without the Fn modifier) will do the same. Not sure if there are third-party keyboards that support Mac media keys, but I'm guessing there are some at least...
Only, sometimes it doesn't work. (For me, on a Norwegian keyboard, it is CMD+<.)
Specifically, sometimes it works with my Safari windows and sometimes it doesn't.
And sometimes when it doesn't work, Option+< will work for some reason.
But sometimes that doesn't work either, and then I just have to swipe and slide, or use alt-tab (yes, you can now install a program that gives you proper alt-tab, so I don't have to deal with this, IMO, nonsense; it just feels like the right thing when I know I'm just looking for the other Safari window).
I'm not complaining; I knew what I was getting into when I asked $WORK for a Mac. I have had one before, and for me the tradeoff of having a laptop supported by IT and with good battery life is worth it, even if the UX is (again, IMO) somewhat crazy for a guy who comes from a C64 -> Win 3.1 -> Windows 95/98 -> Linux (all of them, and a number of weird desktops) background.
That has terrible ergonomics for anyone using a non-US keyboard, though - the backtick is immediately above the option key so to hit together with CMD requires clawing together your thumb and little finger.
GNOME does this much better, as it instead uses Super+<whatever the key above Tab is>. In the US, that remains ` but elsewhere it's so much better than on MacOS.
> That has terrible ergonomics for anyone using a non-US keyboard, though - the backtick is immediately above the option key so to hit together with CMD requires clawing together your thumb and little finger.
That's true, which is why I remap it to a "proper" key, above Tab, with:
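One way to do that remap (a sketch, not necessarily what the parent uses) is hidutil. The usage IDs follow Apple's TN2450 scheme, and 0x700000064 assumes an ISO keyboard where the key above Tab reports as the ISO section key:

    hidutil property --set '{"UserKeyMapping":[
      {"HIDKeyboardModifierMappingSrc": 0x700000064,
       "HIDKeyboardModifierMappingDst": 0x700000035}
    ]}'
    # Resets at reboot; wrap it in a LaunchAgent if you want it to persist.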
Apple really doesn't tell power-users about a lot of these features. You can really gain a lot by searching for Mac shortcuts and tricks. I still learn new things that have been around for over a decade.
I'd argue if you need to be told about keyboard shortcuts, then you're not a power user. (I.e., knowing how to find keyboard shortcuts I'd consider a core trait of power users).
As a hybrid macOS / Windows user (with 20+ years of Windows keyboard muscle memory), I found Karabiner Elements a godsend. You can import complex key modifications from community built scripts which will automatically remap things like Cmd+Tab to switch windows, as well as a number of other Windows hotkeys to MacOS equivalents (link below):
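For anyone curious what those community rules look like under the hood, a Karabiner complex modification is plain JSON. A minimal hand-rolled sketch (the key choice and bundle identifier here are illustrative, not taken from the linked collection):

    {
      "description": "Ctrl+C copies like Cmd+C, except in Terminal (sketch)",
      "manipulators": [{
        "type": "basic",
        "from": { "key_code": "c", "modifiers": { "mandatory": ["left_control"] } },
        "to": [{ "key_code": "c", "modifiers": ["left_command"] }],
        "conditions": [{
          "type": "frontmost_application_unless",
          "bundle_identifiers": ["^com\\.apple\\.Terminal$"]
        }]
      }]
    }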
I'm sorry, I just really hate this Apple Fanboy rhetoric. It's frequent and infuriating. Don't get me wrong, I hate it when the linux people do it too, but they tend to tell you how to get shit done while being mean.
The biggest problem with Linux is poor interfaces[0], but the biggest problem with Apple is handcuffs. And honestly, I do not find Apple interfaces intuitive. Linux interfaces and structure I get; even if the barrier to entry is a bit higher, there's lots of documentation. Apple, less so. But also with Apple there are things that are needlessly complex, buried under multiple different locations, and inconsistent.
But I said the biggest problem is handcuffs. So let me give a very dumb example. How do you merge identical contacts? Here's the official answer[1]
Either:
1) Card > Look for Duplicates
2) Select the duplicate cards, then Card > Merge Selected Cards.
Well, guess what? #2 isn't an option! I believe this option only appears if you have two contacts that are in the same address book. Otherwise you get the option "Link Selected Cards", something that isn't clear since the card doesn't tell you what account it is coming from, and clicking "Look for Duplicates" won't offer this suggestion to you. There are dozens of issues like this where you can be right that I'm "holding it wrong", but that just means the interface isn't intuitive. You can try this one out: go to your Contacts, select "All Contacts", and then, by clicking any random one, try to figure out which address book that contact is from. It will not tell you unless you have linked contacts. And that's the idiocy of Apple. Everything works smoothly[2] when you've always been on Apple and only use Apple, but it's painful to even figure out what the problem is when you have one. The docs are horrendous. The options in the menu bar change and inconsistently disappear or gray out, leading to "where the fuck is that button?".
So yeah, maybe a lot of this is due to unfamiliarity, but it's not like they are making it easy. With Apple, it is "Do things the Apple way, or not at all". But with Linux it is "sure whatever you say ¯\_(ツ)_/¯". If my Android phone is not displaying/silencing calls people go "weird, have you tried adjusting X settings?" But if my iPhone is not displaying/silencing calls an Apple person goes "well my watch tells me when someone is calling" and they do not understand how infuriating such an answer is. Yet, it is the norm.
I really do want to love Apple. They make beautiful machines. But it is really hard to love something that is constantly punching you in the face. Linux will laugh when you fall on your face, but it doesn't actively try to take a swing or put up roadblocks. There's a big difference.
[0] But there's been a big push the last few years to fix this and things have come a long way. It definitely helps that Microsoft and Apple are deteriorating, so thanks for lowering the bar :)
> With Apple, it is "Do things the Apple way, or not at all".
Well, kinda: you don't have to use all that much Apple software on Macs, though. If you can live with the window manager / desktop environment, then you can use whichever apps you choose for pretty much anything else.
> The window manager. I'm not a fan of all the animations and needing to gesture between screens (and yes, I've been down the hotkeys rabbit hole). To install a 3rd party window manager, you need to disable some security setting because apparently they work by injecting into the display manager and calling private APIs.
For using the vanilla macOS workspaces, though: if you avoid full screen apps (since those go to their own ephemeral workspace that you can't keybind, for some stupid reason) and create a fixed number of workspaces, you can bind keyboard shortcuts to switch to them. I have 5 set up, and use Ctrl+1/2/3/4/5 to switch between them instead of using gestures.
Apart from that, I use Raycast to set keybindings for opening specific applications. You can also bind Apple Shortcuts that you make.
Still not my favorite OS over Linux, but I've managed to make it work because I love the hardware, and outside of $dayjob I do professional photography, and the Adobe suite runs better here than on even my insanely overspecced gaming machine on Windows.
Mac laptop hardware is objectively better, but I am in the same camp as the parent post. For most development workflows, Linux is my favorite option. In particular, I think NixOS and the convenience of x86_64 are usually worth the energy efficiency deficit with Apple M.
It will be interesting to see how this evolves as local LLMs become mainstream and support for local hardware matures. Perhaps, the energy efficiency of the Apple Neural Engine will widen the moat, or perhaps NPUs like those in Ryzen chips will close the gap.
I develop using a MacBook because I like the hardware and non-development apps, but all my actual work happens on a Linux server I connect to. It's a good mix.
As a long term Mac user who works on ROS a lot I hear you. Most people here think local dev means developing a React app. Outside of mainstream web frameworks Mac sucks for local dev.
I kinda agree with the OP, but then I was a Linux user for well over a decade. I do think that C/C++ libraries are much, much more of a pain on Mac as soon as you go off the beaten path (compiling GDAL was not pleasant, whereas it would be a breeze on Linux).
Some of this is probably brew not being as useful as apt, and some more of it is probably me not being as familiar with the Mac stuff, but it's definitely something I noticed when I switched.
The overall (graphical) UI is much more fluid and convenient than Linux, though.
I have to agree. The loss of sense of reality among Linux fanboys is really annoying.
I had been a Linux notebook user for many years and have praised it on this board years ago. But today the Linux desktop has regressed into a piece of trash even for basic command line usage while providing zero exclusive apps worth using. It's really sad since it's unforced and brought upon Linux users by overzealous developers alone.
Mac OS trounces Linux in absolutely every way on the desktop; it's not even funny: performance, battery life, apps, usability, innovation. Available PC notebook HW is a laughable value compared to even an entry level Apple MacBook Air. Anecdata, but I have had no fewer than five "pro" notebooks (Dell Latitude, XPS, and Lenovo ThinkPad) come and go in the last five years with basic battery problems, mechanical touchpad problems, touchpad driver issues, WLAN driver issues, power management issues, gross design issues, and all kinds of other crap, so I'm pretty sure I know what I'm talking about.
The one thing Mac isn't great for is games, and I think SteamOS/Proton/wine comes along nicely and timely as Windows is finally turning to the dark side entirely.
> Mac OS trounces Linux in absolutely every way on the desktop it's not even funny: performance, battery life, apps, usability, innovation.
performance - I don't agree
battery life - absolutely
apps - absolutely
usability - I don't agree
innovation - I don't agree
One significant annoyance associated with Linux on a laptop is that configuring suspend-then-hibernate is an arduous task, whereas it just works on a Macbook.
But, the main thing is commercial application support.
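For reference, the systemd side of suspend-then-hibernate is only a few lines; the arduous part is the prerequisites (a swap partition or file big enough to hold RAM, plus a resume= kernel parameter). A sketch for a systemd-based distro:

    # /etc/systemd/logind.conf
    [Login]
    HandleLidSwitch=suspend-then-hibernate

    # /etc/systemd/sleep.conf
    [Sleep]
    HibernateDelaySec=60min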
I can relate. I've spent almost 30 years working primarily on Linux. I moved Windows into a VM for when I needed it around for occasional MS Office use, first under VMware and later under KVM. Now I don't even use it as a VM, since work has Office 365.
My work got me a similar M4 MacBook Pro early this year, and I find the friction high enough that I rarely use it. It is, at best, an annoying SSH over VPN client that runs the endpoint-management tools my IT group wants. Otherwise, it is a paperweight since it adds nothing for me.
The rest of the time, I continue to use Fedora on my last gen Thinkpad P14s (AMD Ryzen 7 PRO 7840U). Or even my 5+ year old Thinkpad T495 (AMD Ryzen 7 PRO 3700U), though I can only use it for scratch stuff since it has a sporadic "fan error" that will prevent boot when it happens.
But I'm not doing any local work that is really GPU dependent. If I were, I'd be torn between chasing the latest AMD iGPU that can use large (but lower bandwidth) system RAM versus rekindling my old workstation habit to host a full size graphics card. It would depend on the details of what I needed to run. I don't really like the NVIDIA driver experience on Linux, but I have worked with it in the past (when I had a current gen Titan X) and have also done OpenCL on several vendors.
Speaking of the P14s, I have an Intel version from 2 years back and battery life is poor. And I hunger for the Mac's screen for occasional photography. The other thing I found difficult is that there's no equivalent of the X1 Carbon with an AMD chip; it's Intel only. The P14s is so much heavier.
I thought the same thing when I saw the M5 in the news today. It’s not that I hate macOS 26, hate implies passion.. what I feel is closer to disappointment.
The problem is their philosophy. Somewhere along the way, Apple decided users should be protected from themselves. My laptop now feels like a leased car with the hood welded shut. Forget hardware upgrades, I can’t even speed up animations without disabling SIP. You shouldn’t have to jailbreak your own computer just to make it feel responsive.
Their first-party apps have taken a nosedive too. They’ve stopped being products and started being pipelines, each one a beautifully designed toll booth for a subscription. What used to feel like craftsmanship now feels like conversion-rate optimization.
I’m not anti-Apple. I just miss when their devices felt like instruments, not appliances. When you bought a Mac because it let you create, not because it let Apple curate.
Seconded. I have a mostly CLI setup and in my experience Nix favors that on Mac, but nonetheless it makes my Nix and Linux setups a breeze. Everything is in sync, love it.
Though if you don't like Nixlang it will of course be a chore to learn/etc. It was for me.
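For the curious, the "everything in sync" part can be a single home-manager flake evaluated on both machines. A rough sketch (the usernames and the contents of home.nix are placeholders, not a recipe):

    {
      inputs = {
        nixpkgs.url = "github:NixOS/nixpkgs/nixpkgs-unstable";
        home-manager.url = "github:nix-community/home-manager";
        home-manager.inputs.nixpkgs.follows = "nixpkgs";
      };
      outputs = { nixpkgs, home-manager, ... }:
        let
          mkHome = system: home-manager.lib.homeManagerConfiguration {
            pkgs = nixpkgs.legacyPackages.${system};
            modules = [ ./home.nix ];  # shared CLI setup: shell, git, tmux, ...
          };
        in {
          homeConfigurations."me@mac" = mkHome "aarch64-darwin";
          homeConfigurations."me@linux" = mkHome "x86_64-linux";
        };
    }

Then `home-manager switch --flake .#me@mac` on one box and `.#me@linux` on the other.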
Really? This surprises me. I've used them for projects and for my home-manager setup and it's always been amazing at it. The best example I can come up with is packaging a font I needed into a nix package for a LaTeX file. It would have taken me a month of trying various smaller projects to know how to do that.
Honestly it helped quite a bit. There are a lot of obscure (imo) errors in Nix that LLMs spot pretty quickly. I made quite a bit of progress since using them for this.
Yes, macOS sucks compared to Linux, but the M chip gets absolutely incredible battery life, whereas the Framework gets terrible battery life. I still use my Framework at work, though.
Yes, there is a dilemma in the Linux space. But is running Linux on a MacBook a viable option these days? Is Asahi Linux solid enough?
I much prefer a Framework and the repairability aspect. However, if it's going to sound like a jet engine and have half the battery life of a new M-series Mac, then I feel like there's really no option if I want solid battery life and good performance.
Mac has done a great job here. Kudos to you, Mac team!
Oh, I have that tab still open from when I was reading the other thread.
Here is the feature support from Asahi. Still a way to go unless you are on an older M1, by the looks of it?
I've been forced to use Macbooks for development at work for the past 7 years. I still strongly prefer my personal Thinkpad running Debian for development in my personal life. So don't just put it down to lack of familiarity.
Try Aerospace. Completely solved window management for me.
Also for dev, set up your desired environment in a native container and then just remote into it with your terminal of choice. (Personally recommend Ghostty with Zellij or Tmux)
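The AeroSpace config is a single TOML file and feels a lot like i3. A fragment of the kind of thing I mean (the bindings are illustrative; the command names are documented in the AeroSpace docs):

    # ~/.aerospace.toml
    [mode.main.binding]
    alt-h = 'focus left'
    alt-l = 'focus right'
    alt-shift-h = 'move left'
    alt-1 = 'workspace 1'
    alt-shift-1 = 'move-node-to-workspace 1'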
I'm often envious of these Macbook announcements, as the battery life on my XPS is poor (~2ish hours) when running Ubuntu. (No idea if it's also bad on Windows - as I haven't run it in years).
MacOS is great for development. Tons of high profile devs, from Python and ML, to JS, Java, Go, Rust and more use it - the very people who headline major projects for those languages.
2ish hours battery life is crazy. It's 8+ hours with the average Macbook.
If you are on an M-series MacBook and aren't running a 3D benchmark the entire time, your Mac is broken if it is dying after 2.5 hours.
Have you checked your Battery Health?
If you have an intel-based Mac, it's the same expected battery life as Windows and 2.5 hours on an intel MacBook battery sounds decent for something 5+ years old.
8+ hours sounds about right. I have a M1 Macbook Pro and even 5 years later I can still use it (a million browser tabs, couple containers, messaging apps) for an entire day without having to charge it.
I get the comment about Docker. Not being able to share memory with Docker makes it a pain to use for running things alongside macOS, unless you have mountains of RAM.
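If you run the Docker VM via Colima rather than Docker Desktop (an assumption; Docker Desktop has the same knob in its GUI settings), resizing the static allocation is at least a one-liner. A sketch, with sizes as examples:

    # Give the Docker VM 8 GiB of RAM and 4 CPUs
    colima start --memory 8 --cpu 4
    # Some Colima versions can't resize an existing VM; `colima delete` first if so.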
Hello! Yes! Writing this on my commute home using my company's M3 Pro, and I hate it. I'm waiting for a new joiner so I can hand this off to someone who has a different brain to me.
I can write up all the details, but it's well covered on a recent linuxmatters.sh and Martin did a good job of explaining what I'm feeling: https://linuxmatters.sh/65/
They're basically things along those lines. They're more nefarious when background services quietly error out and you need to dig to find it was a newly required permission.
Launching unsigned apps is a problem, especially if an app bundle contains multiple binaries, since by default you need to approve an exception for each of them separately.
I know that it's possible to script that, since Homebrew handles it automatically, but if you just want to use a specific app outside of Homebrew, the experience is definitely worse than on Linux/Windows.
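The manual version, for what it's worth, is stripping the quarantine attribute that Gatekeeper keys off of. Only do this for software you trust; the app path here is a placeholder:

    # Clear the quarantine flag on an app bundle and everything inside it
    xattr -dr com.apple.quarantine /Applications/SomeTool.app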
There are a lot of annoying hurdles when allowing some types of application access. Needing to manually allow things in the security menu, allowing unrecognized developers, unsigned apps. Nothing insurmountable so far, but progressively more annoying for competent users to have control over their devices.
For me, coming from Linux, the things I don't like: the overview menu's lack of an (x) to close a window; the way Slack stacks windows within the app so it's hard to find the right one; and that pressing the red button doesn't remove the app from your CMD+Tab cycle between apps, you also have to press CMD+Q. (Just a preference for how Windows and Linux treat windows: actually closing them.) Rectangle resolved the snap-to-corner thing (I know macOS has it natively too, but it's not great in comparison).
Things I prefer: Raycast + its plugins compared to the Linux app search tooling, battery life, performance. Brew vs the Linux package managers, I don't notice much of a difference.
Things that are basically the same: the dev experience (just a shell and my dotfiles make it essentially the same between OSes).
I think the hardest part for me is getting used to using CMD vs CTRL for cut-copy-paste; then, just when I start to get used to it, a terminal breaks the habit with a different key for Ctrl+C. I got used to Ctrl+Shift in terminals on Linux (and Windows) for cut-copy-paste, etc.
It may seem like a small thing, but when you have literal decades of muscle memory working against you, it's not that small.
I'm a lifelong Mac user, so obviously I'm used to using CMD instead of CTRL. Inside the terminal we use CTRL for things like CTRL-C to exit a CLI application.
What messes me up when I'm working on a linux machine is not being able to do things like copy/paste text from the terminal with a hotkey combo because there is no CMD-C, and CTRL-C already has a job other than copying.
IMO apple really messed up by putting the FN key in the bottom left corner of the keyboard instead of CTRL. Those keys get swapped on every Mac I buy.
Ctrl+Shift+(X,C,V) tends to work for many/most terminals in Linux and Windows (including Code and the new Terminal in Windows)...
I agree on the Fn key positioning... I hate it in the corner and tend to zoom in on the keyboard when considering laptops for anyone, just in case. I've also had weird arrow keys on the right side of a laptop keyboard, where I'd hit the up arrow instead of the right shift a lot in practice... it really messed up text-area input.
As a very long-term Linux user, I'm still aggravated when implicit copy and middle-click paste doesn't just work between some apps, since it is so deeply embedded in my muscle memory!
I'm only a recent MacOS user after not using it for over 20 years, so please people correct me if I'm wrong.
But in the end the biggest thing to remember is in MacOS a window is not the application. In Windows or in many Linux desktop apps, when you close the last or root window you've exited the application. This isn't true in MacOS, applications can continue running even if they don't currently display any windows. That's why there's the dot at the bottom under the launcher and why you can alt+tab to them still. If you alt+tab to an app without a window the menu bar changes to that app's menu bar.
I remember back to my elementary school computer lab with the teacher reminding me "be sure to actually quit the application in the menu bar before going to the next lesson, do not just close" especially due to the memory limitations at the time.
I've found once I really got that model of how applications really work in MacOS it made a good bit more sense why the behaviors are the way they are.
Docker works very weirdly (it's a desktop application you have to install that has usage restrictions in enterprise contexts, and it's inside a VM so some things don't work), or you have to use an alternative with similar restrictions (Podman, Rancher Desktop).
The OS also has weird rough edges when used from the terminal - there are read-only parts, there are restrictions on loading libraries, multiple utilities come with very old versions or BSD versions with different flags than the GNU ones you might be used to coming from Linux, the package manager is pretty terrible. There are things (e.g. installing drivers to be able to connect to ESP32 devices) that require jumping through multiple ridiculous hoops. Some things are flat out impossible. Each new OS update brings new restrictions "for your safety" that are probably good for the average consumer, but annoying for people using the device for development/related.
Docker on Mac has one killer feature, though: bind mounts remap permissions sensibly, so that the uid/gid in the container is the correct value for the container rather than the same uid/gid from the host.
The workarounds on the internet are like "just build the image so that it uses the same uid you use on your host", which is batshit crazy advice.
I have no idea how people use Docker on other platforms where this doesn't work properly. One of our devs has a Linux host and was unable to use our dev stack, and we couldn't find a workaround. Luckily he's a frontend dev and eventually just gave up on the dev stack in favour of running Requestly to forward the frontend from prod to his local tooling.
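For anyone who hasn't hit it, the Linux failure mode looks roughly like this (a sketch):

    # On a Linux host, files created in a bind mount keep the container's uid:
    docker run --rm -v "$PWD:/work" -w /work alpine touch out.txt
    ls -l out.txt    # owned by root on the host, not by you

    # The usual (imperfect) workaround: run as your host uid/gid
    docker run --rm --user "$(id -u):$(id -g)" -v "$PWD:/work" -w /work alpine touch out2.txt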
>The OS also has weird rough edges when used from the terminal - there are read-only parts, there are restrictions on loading libraries, multiple utilities come with very old versions or BSD versions with different flags than the GNU ones you might be used to coming from Linux, the package manager is pretty terrible.
You use nix or brew (or something like MacPorts).
And they are mighty fine.
You shouldn't be concerned with the built-in utilities.
I suggest trying Nix on macOS; it is very nice as a package manager, but it can also be used as a way to replace Docker (at least for my needs, it works very well).
These days I don't even bother installing brew on my Mac, I only use Nix.
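As an example of what "replacing Docker" can mean here: for tools I just need available locally, an ad hoc shell is often enough (the package names are nixpkgs attributes and may differ):

    # A throwaway shell with postgres and redis binaries on PATH, no container
    nix-shell -p postgresql redis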
What are the differences, though? I have an MBP and a PC with Fedora on it, and I barely see any differences aside from sandboxing in my atomic Kinoite setup and a different package manager.
People often hate on brew, but as a backend dev I haven't encountered any issues for years.
I can tell you in one sentence: try to run your own DNS server when mDNSResponder sits on port 53 (for example, because you use the new virtualization framework).
And there are a lot of such things which are trivial or a non-problem on Linux.
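You can see the collision for yourself; a quick check (what it reports will vary by setup, but on macOS it is frequently mDNSResponder):

    # Who is holding DNS on this box?
    sudo lsof -nP -iUDP:53 -iTCP:53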
The issues I see people struggle with on a Mac is that development often needs things in a non-default and often less-secure setup.
There isn't a "dev switch" in macOS, so you have to know which setting is getting in your way. Apple doesn't like to EVER show error alerts if at all possible to suppress, so when things in your dev environment fail, you don't know why.
If you're a seasoned dev, you have an idea why and can track it down. If you're learning as you go or new to things, it can be a real problem to figure out if the package/IDE/runtime you're working with is the problem or if macOS Gatekeeper or some other system protection is in the way.
I also like the multi-desktop experience on KDE more, but I've recently found out you can at least switch off some of the annoying behavior in the Mac settings, so that e.g. it no longer switches to another desktop when you click on a dock icon for an app that is open on another desktop.
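The checkbox lives in System Settings under Desktop & Dock. If you prefer the terminal, the community-documented key below should be equivalent, though treat the key name as an assumption and verify it on your macOS version:

    # Don't jump to another Space when activating an app with windows there
    defaults write -g AppleSpacesSwitchOnActivate -bool false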
I have a Macbook Air and I pretty much use it as an ssh machine. It is definitely over priced for that, but it at least beats the annoyance of having to deal with Windows and all the Word docs I get sent or Teams meetings... (Seriously, how does Microsoft still exist?)
Since I mostly live in the terminal (ghostty) or am using the web browser I usually don't have to deal with stupid Apple decisions. Though I've found it quite painful to try to do some even basic things when I want to use my Macbook like I'd use a linux machine. Especially since the functionality can change dramatically after an update... I just don't get why they (and other companies) try to hinder power users so much. I understand we're small in numbers, but usually things don't follow flat distributions.
> I had to split all my dotfiles into common/Linux/Mac-specific sections
There are often better ways around this. On my machine, my OSX config isn't really about OSX specifically but about what programs I might be running there[0]. Same goes for Linux[1], which you'll see is pretty much just about CUDA and aliasing apt to nala if I'm on a Debian/Ubuntu machine (sometimes I don't get a choice).
I think what ends up being more complicated is when a program has a different name under a distro or version[2]. Though that can be sorted out by a little scripting. This definitely isn't the most efficient way to do things but I write like this so that things are easier to organize, turn on/off, or for me to try new things.
What I find more of a pain in the ass is how commands like `find`[3] and `grep` differ. But usually there are ways you can find to get them to work identically across platforms.
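In practice my top-level file is just a small dispatcher; a trimmed sketch of the idea (the paths and the ggrep alias are illustrative):

    # ~/.zshrc: one entry point, per-OS fragments
    case "$(uname -s)" in
      Darwin) source "$HOME/.config/shell/macos.sh" ;;
      Linux)  source "$HOME/.config/shell/linux.sh" ;;
    esac

    # Smooth over BSD vs GNU differences by preferring GNU tools when present
    command -v ggrep >/dev/null 2>&1 && alias grep='ggrep'  # from `brew install grep`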
> Don't expect to be able to clone and build any random C++ project unless someone in the project is specifically targeting Mac.
Yes, with major tradeoffs. Asahi Linux is an amazing project, but they have not yet figured out how to get anywhere close to a Mac's power efficiency when it is running MacOS. For example, you will lose a lot of battery life[0][1] with the lid closed, whereas on MacOS you lose pretty much nothing.
macOS has a different dev culture than Linux, but you can get pretty close if you install the Homebrew package manager. For running LLMs locally I would recommend Ollama (easy) or llama.cpp. Due to the unified memory, you should be able to run larger models than what you can run on a typical consumer grade GPU, but slower.
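A minimal path to a first local model, assuming Homebrew is already set up (the model tag is just an example):

    brew install ollama
    ollama serve &          # local API on http://localhost:11434
    ollama run llama3.2     # pulls the weights on first run, then chats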
For me, a VM set up via UTM works quite well on my Mac. Just make sure you do not virtualize x86; that kills both performance and battery life. This way I get the nice battery life and performance in a small package, but am not limited by macOS for my development.
I suggest you head over to /r/unixporn, and you'll probably be pleasantly surprised. Contrary to popular belief, most of this stuff is not very hard to set up. Of course, there are people also showing off custom tooling and the craziest (and sometimes silliest) things they can pull off, but a beautiful interface is usually something you can do in under an hour. It is customary to include a repo with all configurations, so if you wanted to direct copy-paste, you could do it much faster than that.
Unless you're talking about the look of the physical machine. Well then that's an easier fix ;)
Yeah, I am using a 2020 MacBook Air M1 with macOS 15.7.1 (on which I am about to install Asahi Linux) and I have no issues as a casual user. For most people who use MacBooks, I see no reason to buy an M5 or M4 over an M1.
One of the problems is that I don't notice a meaningful difference (that's worth the money) between my M1 and my M4 workloads (dev/video). Obviously the rendering is faster, but the OS isn't. Tahoe makes my M2 feel like an Intel Mac.
Chip, memory and storage are really fast, but I’m fully convinced that the OS is crippling these machines.
> Sorry you made your first gen chip so good that I don't feel the need to upgrade
M1 MacBooks are ~5 years old at this point, and if you've been working a laptop hard for 5 years it's often worth getting an upgrade for battery life as well as speed.
Frankly, nothing holds a candle to the battery performance of the M-series machines, so it's likely a safe bet to assume that advantage will also translate into longer overall life/battery health until we see otherwise. We'll see in a few years, I suppose.
I felt the same way about the battery in my 2018 MacBook ... it was losing capacity, but I didn't mind as it still ran for hours between charges.
Then it started having issues waking up from sleep. Only the OG Apple charger could wake it up, then it would see it actually had 40-60% battery but something had gone wrong while sleeping and it thought it was empty.
Intel MacBooks had terrible SMC issues, so maybe this won't afflict the M-series. Just sharing because I could still use that MacBook for a few hours between charges; it just couldn't be trusted to wake up without a charger. That's really inconvenient and, combined with new features, got me to upgrade.
How much would you charge me to swap out my MacBook Pro 2018 15.4" battery using authorized methods to not cause other damage? I want my laptop back within a few days, a 90 day warranty on parts and labor, and I want a genuine Apple battery - not some unknown 3rd party.
I got that as well.. more annoying are comparisons with the last Intel options, which sucked then.
I'm still doing fine with a 16GB M1 Air; I mostly VPN+SSH to my home desktop when I need more oomph anyway. It lasts a full day, or all week when you just check email on vacation once a day.
In terms of performance, thermals, and battery life, it was a huge upgrade for me when I moved from Intel to the M1 Max. M1 Max to the M4 Max... improvements were mainly on very heavy tasks, like video transcoding with Handbrake or LLMs with Ollama.
I'm guessing carriers/networks can't handle a fleet of MacBooks-with-cellular yet. The data workload would be sustained and intense with macOS not having the type of system-level cellular framework/data control as iPad and iOS (I have used the low data mode on macOS, it helps but only handles a small part of the problem).
I have bought cracked-screen iPhones since Personal Hotspot allowed wired connections back in the 2000s, velcro'd them to the back of my MacBook screen and have been living the "I have internet on my Mac everywhere" life since then. With 5G, I can't really tell when I'm on Wi-Fi vs. when my MacBook opts for the hotspot connection.
I'd love a cellular MacBook and would also insta-buy, but I've given up hope until the next network upgrade.
That doesn’t make much sense to me, there are literally billions of phones that people are using all the time.
Apple has over 2.3 billion active devices of which a small percentage are Macs (an estimated 24 million were sold in 2024 and around twice that in iPads).
The most difficult to scale part of a cell network is number of devices connected, not bandwidth used anyway and cellular Macs aren’t going to add significantly more load to a network. And that assumes that Apple even cares what a carrier thinks.
I’m in Australia, not the USA, and for all people like to complain about internet here, we have excellent mobile coverage and it’s relatively affordable, but it’s all priced by usage.
I have 4 devices on my plan with shared 210GB of 4G usage between them for around AUD$200 (USD$130) a month on Australia’s best network (Telstra). I work remotely from cafes a lot (probably around 20-30 hours a week) as a developer and get nowhere close to that usage. I update all my apps, download all my podcasts, listen to lossless music and stream video whenever I want during breaks (although I’m not a huge out-of-home video consumer). I do literally nothing to limit my bandwidth usage and am lucky to use 30-40GB a month across all my devices.
> (I have used the low data mode on macOS, it helps but only handles a small part of the problem)
Yes, I mentioned that in the post you responded to.
> Not sure which apps, if any, respect it, but it's there
It reduces data consumption for me by about 1/5. Not nothing, but the Mac can easily consume hundreds of GB of data a week doing "normal" activities. YouTube on a MacBook is many times more data than the equivalent on a phone screen.
I've heard (but not tested) that Tahoe and iOS 26 do a _much_ better job of auto-connecting and reconnecting (if your cell drops, like going through a tunnel or similar) to make it easier to use your phone with your MBP.
I hope this is the case. I don't know if I would buy a cellular MBP (just wouldn't use it enough) but better tethering is a huge win for me.
I know minis don't sell well, but I wish they had kept the 11" Air format, just without the bezel, one way or another.
My craving has been answered by the GPD WIN MAX 2, a 10" mini laptop with lots of ports and bitchin' performance (the AI 9 chip sips battery). It's Windows, but with an upgrade to Pro to disable the annoying stuff via Group Policy, plus never signing into a Microsoft account; it's amazing how much faster it is than a machine that's always trying to authenticate to the cloud before it does anything. Wake from sleep is also excellent, which was the main thing that kept me using MacBooks. Anyway, it's the first computer I've bought in a decade that has some innovation to it.
Edit: there's a slot for a cellular modem but I haven't done enough research to find one that will play nice on US networks
Why carry around two cellular modems? Are you ever out and about with your computer but not your phone? I've been happy to hotspot my computers and tablets to my phone, which I always have with me.
The only possible issue I can think of is battery life, but if I'm carrying around my laptop I can throw a charge cable in the bag to keep my phone juiced.
I want my computer to have an always on cell modem just like my phone does.
The Apple Silicon chips all run in a version of always on these days because the efficiency cores are so, well, efficient.
Additionally, while you may want to burn the battery in multiple devices and deal with having to manage that, I don’t want to.
Apple has been selling cellular iPads since the beginning and I love never having to worry about pairing mine.
Tethering to an iPhone or iPad is much better than it used to be, but it's still not perfect.
Apple makes their own modems these days and even with Qualcomm had a capped per device license fee more than covered by the premium they charge for cellular in, say, the iPad.
I know so many people who want this convenience and are willing to pay for it that it just seems like stubbornness at this point that they’re willing to put modems in iPads and not MacBooks.
I leave my setup plugged in, using a low-profile USB-C to lightning cable on the iPhone SE stuck on the back of my screen and wired hotspot on macOS is a great experience.
We're discussing a MacBook someday with a built-in phone, the closest I've found is an iOS device wired to my MacBook as a wired hotspot. It's like having fast wifi everywhere.
Using my personal phone (that I also use for other things like calls) wouldn't be like having wifi everywhere on my Mac, for example if I walk away from my laptop while on the phone the Mac would lose internet.
The pairing has become almost flawless as well. Years ago, it was slow and inconsistent, but now the hotspot feature is almost perfect and automatic. Honestly, I don’t really think about it anymore.
MVNOs FTW. They know they're competing for price-conscious consumers so have to offer more value. The big 3 know most of their customers are going to go with one of the big boys, all of whom are expensive and not great.
MVNOs have slower data rates since they buy deprioritized traffic in bulk, don’t have the roaming agreements domestically and especially not internationally, and don’t offer unlimited high speed data.
Based on the type of responses you are giving, I actually do believe you probably call your phone company's customer service regularly. So perhaps your criteria might be different. Have you heard of Consumer Cellular?
Removing the charger is a good move, in my opinion.
USB-C chargers are everywhere now. Monitors with USB-C or Thunderbolt inputs will charge your laptop, too. I bought a monitor that charges over the USB-C cable, and I haven't used the charger that came with the laptop in years because I have a smaller travel charger that I prefer for trips anyway.
You don’t have to buy the premium Apple charger and cable. There are many cheap options.
I already have a box of powerful USB-C chargers I don’t use. I don’t need yet another one to add to the pile.
The battery is good enough that I often travel with just my phone charger. I can plug the laptop in at night when the slow charge rate isn't a hindrance and be fine with the all day battery life
I've got several 50+W chargers from other devices (old MBP, soldering iron, a generic one). If you don't have a high power charger, buy one. Easy enough. But there are plenty of us that don't need another.
Every time I’ve gotten rid of an old laptop the charger goes with it. They are a package deal in my book.
I actually have very few USB-C chargers. With everyone leaving them out of the box, I don’t happen to have a bunch of them by chance. They took them out of the box before giving time for people to acquire them organically. I never bought a single lightning cable, but almost all my USB-C cables had to be purchased. This is not great, considering how confusing the USB-C spec is.
Other than the one that came with my M1 MBP (which I will lose when I sell it), I have had to purchase every charger I have.
Not being able to charge a $1,500+ laptop without buying a separate accessory is crazy to me. I’ve also seen many reports over the years comparing Apple chargers to cheap 3rd party ones where there are significant quality differences, to the point of some of the 3rd party ones being dangerous or damaging. I don’t know why Apple would want to open the door to more of that.
I assume a lot of people will use a phone charger, then call support or leave bad reviews, because the laptop is losing battery while plugged in. Most people don’t know what kind of charger they need for their laptop. My sister just ordered a MacBook Air a couple weeks ago and called me to help order, and one of the questions was about the charger, because there were options for different chargers, which confused her and had her questioning if one even came with it or if she had to pick one of the options. This is a bad user experience. She’s not a totally clueless user either. She’s not a big techie, but in offices she used to work with, she was the most knowledgeable and was who they called when the server had issues. She also opened up and did pretty major surgery on her old MacBook Air after watching a couple YouTube videos. So I’d say at least 50% of people know less than her on this stuff.
Apple positions themselves as the premium product in the market and easy to use for the average user. Not including the basics to charge the internal battery is not premium or easy. I can see it leading to reputational damage.
I have some 50+W chargers from old devices also. However, they are much, much bigger than the current ones. That doesn't matter when my computer is plugged in at home, but I wouldn't want to travel with one since it's easily 3x the size/weight.
My MacBook M1 Pro w/ 441 cycles started doing a fun thing where if the battery gets under about 50% and you put it to sleep, the ONLY way to power on the device is to use the exact charger it came with. Higher powered Apple Studio Display PD, or even good 3rd party chargers, do not bring it back to life. This occurs even when the battery has 40-60% remaining if the laptop goes to sleep.
Had a similar issue with my 2018 MBP Intel - the 86/87 Watt Apple charger was the only thing it would come to life with as the battery aged if the device got too low.
My dad had similar trouble with an M1 MacBook Pro that got a depleted battery. Two chargers he had wouldn't work, but fortunately the Anker charger that I used for my laptop did work with one of the cables that I had (though not another). Once it got a little juice into it, then it was fine and he could switch back to his. But I think he was a bit more careful to avoid total depletion after that.
In 2018 I had a phone that entered a boot loop: battery depleted, plug it in, it automatically starts booting, it stops charging while booting, it dies due to depletion, it recognises it’s plugged in and starts charging, boot, stop, die, start, boot, stop, die… I tried every combination of the four or five cables that I had with a car USB power adapter and someone’s power bank, nothing worked. Diverted on my way home (an 8 hour journey) to buy another phone because I needed one the next day. When I got home, I tried two or three power adapters with all the cables and finally found one combination that worked. I learned the lesson that day that allowing devices to deplete completely can be surprisingly hazardous.
> I learned the lesson that day that allowing devices to deplete completely can be surprisingly hazardous.
The solution is to keep your devices charged. This is feasible if you have a few devices. Not practical for someone like me. I have too many devices. I don't use every device daily.
Yes, I don't often let batteries deplete, but the issue I'm having on my MacBooks is that they will "die" with plenty of charge (often 40-60% left). But the computer thinks it is at 0% and won't boot past the "plug me in!" screen with anything except the OG charger from the Apple box. As soon as you connect the OG charger, it boots automatically and you see 0% battery go to 40-60% battery. At this point, you can unplug the MacBook and use it - as long as you don't put it to sleep. It's obviously battery/power related, but the only fix is using the charger that says/does exactly what my MacBook wants. I wonder how Apple handles this on the M5.
In my experience a low-power charger will revive it; you just have to wait for it to hit enough state of charge, since it is effectively starting off the battery. This does take a while, but starting dead on a supply that can't guarantee enough power would be dumb.
My M1 Pro with 441 battery cycles won't power back on without the Apple charger it came with if I close the lid or put it to sleep and the battery isn't over 60%... something happens and the computer goes into a sleep state where the battery doesn't drain but no charger except the OG brings it back to life.
Even a Studio Display, which can provide more power than my M1 Pro can use, won't wake it from this state. Apple wants $300 for a replacement battery, so I'll just buy a new MacBook at that price. But the charger situation doesn't bode well for M5 MacBook buyers who wonder why their Mac is dead one day (and they just need the exact charger the system wants, but Apple didn't provide it).
Replacing a MacBook battery is a lot of delicate work. Not everyone has steady hands, great eyes, etc. For $300, the Apple Store is a better deal for most (and guaranteed to be a quality battery with warranty) compared to the difficulty of a $110 battery kit.
I don't want to use a 3rd party battery in a device I carry with me most places I go...
When I buy a new laptop and sell my old one, I either have to sell the old one with the charger or keep the charger and sell the old one without. I don't actually have a bunch of spare chargers capable of charging a laptop (phone, sure).
This is especially true for someone moving up to an MBP from an MBA, which takes less juice.
I think it's awesome.
I have way too many chargers, especially USB-C: 5, 10, 20, 35, 70, and 95W, all over the house and office.
If you need one, just shell out the extra $100 that corresponds to your needs.
The real crime is that it starts at 1799 euros, which is $2100, vs $1599 in the US, I know US prices are before tax but even with 20% VAT you're far off...
Apple overprices everything in the EU on top of not shipping new features. Currency risk is a thing but nowhere near the premium they charge. I personally vote with my wallet and stopped buying anything from them.
Those regulations don't prevent apple from shipping anything, they prevent companies from abusing their users. Apple is free to ship without abusing anyone, but explicitly choose otherwise.
ASUS’s ZenBook Duo (UX84060) is over 50% dearer in Australia than in the USA.
When it was announced, I expected it to be at least 4000 AUD (~2600 USD). When I heard it was starting at 1500 USD instead (~2300 AUD), I was astonished and very excited. And it still is that price… but only in the US. In Australia it is 4000 AUD (the 32GB/1TB model, which is 1700 USD, ~2600 AUD). So I sadly didn’t get one.
Is the rest of the world subsidising the US market, or are they just profiteering in the rest of the world?
> Under EU rules, if the goods you buy turn out to be faulty or do not look or work as advertised, the seller must repair or replace them at no cost. If this is impossible or the seller cannot do it within a reasonable time and without significant inconvenience to you, you are entitled to a full or partial refund. You always have the right to a minimum 2-year guarantee from the moment you received the goods. However, national rules in your country may give you extra protection.
> The 2-year guarantee period starts as soon as you receive your goods.
> If a defect becomes apparent within 1 year of delivery, you don't have to prove it existed at the time of delivery. It is assumed that it did unless the seller can prove otherwise. In some EU countries, this period of “reversed burden of proof” is 2 years.
I think you read him backwards. It’s still cheaper in the US. Tariffs certainly exist in Europe but I’m unaware of any on these laptops and US Tariffs on goods from China don’t apply to goods from China to anywhere else that isn’t the US.
MacBooks shipped to Europe don't ever touch US ground (and I'd wager 99.9% of their parts don't either). So US tariffs should be irrelevant - and the EU doesn't have big China tariffs outside of EV and solar panel anti-dumping retaliation.
I have a strong feeling Apple is raising prices elsewhere in order to avoid pissing off the notoriously sensitive consumers in America. Sony is doing similar things with making the PlayStation expensive everywhere to make it affordable for Americans. The world is essentially subsidizing the tariffs for Americans.
People in other countries will get pissed but ultimately suck it up and buy a product. People in America will take it as a personal offense due to the current Maoist-style cult of personality, and you'll get death threats and videos of them shooting your products posted onto social media. Just look at what happened to that beer company. No such thing would happen in Germany.
Many people got played on the charger thing. It’s never free, it’s a mandatory bundle. But companies only put one line item on the receipt, never refer to the primary component separately but instead conflate its name and idea with the bundle, and when forced to de-bundle (usually) bump the primary component’s price to compensate, and people buy it: “the EU took away my charger!”
Chargers don’t change quickly. If I lost my charger from 2019, the ideal replacement in 2025 would be literally exactly the same model—and mine still works like new and looks good. I have nothing to gain from buying a new charger.
We should be cheering the EU for ending an abuse that the US has long failed to.
Also, it still bundles a USB-C to MagSafe 3 cable.
If you sell your old laptop when you buy a new one, you generally sell it with old charger. And different Apple laptops take chargers of different maximum watts (they're compatible but not optimal), so they're not all the same anyways.
There's a reason they generally make sense to bundle. Especially with laptop chargers, which provide a whole lot more power than some random little USB-C charger you might have. Sometimes letting the free market decide actually gives customers what they want and find most useful.
> If you sell your old laptop when you buy a new one, you generally sell it with old charger.
Sounds like a symptom of incompatibility. I’ve only ever included the charger when it was specific to the laptop.
> And different Apple laptops take chargers of different maximum watts (they're compatible but not optimal), so they're not all the same anyways.
Chargers automatically provide whatever power level is needed, up to their max, and charging power isn’t the steady tick upward we’re used to elsewhere. The MacBook Pro did get a faster charger a few years ago, relegating old ones to that “compatible but not optimal” state, but meanwhile MacBook Air chargers got slower, and most releases didn’t change the charger. Certainly there are sometimes benefits to buying a new charger, but it happens much less often than new device purchases, and even when there are benefits purchases should still be the customer’s choice.
> Sometimes letting the free market decide actually gives customers what they want and find most useful.
I agree, but “free market” doesn’t mean lawlessness, it means an actual market that’s actually free. Actual market: companies compete on economics, not e.g. violence or leverage over consumers. Actually free: consumers freely choose between available options. Bundling is a very classic example of an attempt to circumvent free market economics, using the greater importance of one choice to dictate a second choice.
> Bundling is a very classic example of an attempt to circumvent free market economics, using the greater importance of one choice to dictate a second choice.
Only when there's no competition and you can use that to abuse market power.
But competition for laptops is strong. Most consumers want their laptops to come with a charger, even if you personally don't. That's why they're sold that way.
Like, nobody says the free market is failing because Coke forces me to buy carbonated H2O along with their syrup at the grocery store. The market prefers it when they're bundled.
Don't forget the environmental impact of a smaller box. The box will probably be less than half as thick, doubling shipping efficiency. These are air freight, so the CO2 impact is not negligible.
I'll take the discount and use one of my 12 existing USB-C chargers.
There are more 90W-capable USB-C chargers in my home than there are laptops. I certainly don't need another one.
Honestly I'd be fine for them to just remove the box altogether and use a paper envelope like Steve Jobs did once.
>Don't forget the environmental impact of a smaller box
Compared to the marginal environmental impact of sourcing materials, building hardware and parts, assembling, shipping, stocking, and transporting each unit to the customer, the box could be 10x larger and it wouldn't make a dent.
> ship ... the box could be 10x larger and it wouldn't make a dent
This is not how shipping works.
A larger box, even by 1 inch on any direction, absolutely makes a huge difference when shipping in manufacturing quantities. Let's not pretend physical volume doesn't exist just to make an argument.
10 planes flying with MacBooks == much different than 1 plane (in other words, when you 10x the size of something, as you suggest, it does actually have a huge impact)
The point being made is "it's not the paper for the box that's the issue".
A smaller box allows more to be carried. But if we go that route, it's trivial to ship them without any box and box them domestically - and that's a 2-3x volume reduction right there.
> it's trivial to ship them without any box and box them domestically
Ah yeah I can't imagine any scenario where this could go wrong
Like man in the middle attacks
Replacement/fake products
... or you know, damage? Boxes provide... protection.
> it's trivial
Anytime you catch yourself thinking something is trivial, you're probably trivializing it (aka think about it more and you'll probably be able to think of a dozen more reasons packaging products is the norm)
Dropping the price was nice. They could have gotten away with a slight reduction in price, and a coupon inside to send away for a "free" charger, and then bask in the millions who couldn't be bothered to do it.
What really bugs me is that the huge performance gains are quoted against the M1 and a 5-7 year old Intel Mac, which from my own memory had throttling and overheating issues. While not as impressive, I'd really appreciate it if they simply showed the generational gains, or actual charts against several previous generations.
I'm still pretty happy with my 16GB M1 Air, but it would be nice to know some closer-to-real-world differences.
You can't expect Apple to make an argument against their own chips... you're asking them to admit that they are making ~20% a year improvements when they want buyers to think it's a multiples-of-X improvement.
> How did we already get to no-one being impressed by 20% better PER YEAR already.
When has 20% been impressive? When Intel to M1 happened, the jump was huge ... not 20%. I can't think of anything with a 20% jump that made waves, even outside of tech.
When I used to do user benchmarking, 20% was often the first data point where users would be able to notice something was faster.
4 minutes vs 5 minutes. That's great! Kind of expected that we'll make SOME progress, so what is the low bar... 10%? Then we should be impressed with 20?
People aren't upgrading from M1, M2, M3 in numbers... so I don't think it's just me that isn't wow'd.
No WiFi 7, only WiFi 6E, is annoying. Especially for what they are charging. And Bluetooth 5.3: their Pro Macs are slower than their iPhone Pro.
The SSD has double the speed. I guess they say this only for the M5 MacBook Pro, because the previous M4 always had slower SSD speeds than the M4 Pro, at 3.5GB/s. So the M5 should now be at 7GB/s.
The non-pro/max chipped MBPs have always been a little 'lower spec' in several regards. There used to be a little more separation though, with the non-pro chips available only in the Air & 13" MBP, but back then people complained about Apple having 'too many models'...
I suspect the M5 Pro/Max chipped MBPs will bring some of these improvements you're looking for.
With NVMe NASes and 5Gbit and 8Gbit FTTH available for a reasonable price in many places, it's easy to saturate any WiFi connection just by downloading stuff (games, AI models, etc.), backing up files, or accessing your files on a NAS (and editing videos straight off a NAS is recently trendy).
Anyone know when to expect the M5 Pros? I am on a base 16gb M1 and struggling hard in daily workloads. I am often running at 20gb of swap memory usage.
I don't really use local LLMs but think 32GB RAM would be good for me... but I am so ready to upgrade but trying to figure out how much longer we need to wait.
First rule of Mac world is get the most memory you can afford.
I got the cheapest M1 Pro (the weird one they sold that's binned due to defects) with 32GB RAM and everything runs awesome.
Always get the most RAM you can in Mac world. Running a largish local LLM model is slowish, but it does run.
A Mac out of memory is just a totally different machine than one with enough.
Probably because most of the devs building the software are on the highest RAM possible, and there is just so much testing and optimization they don't do.
Part of me misses my OG base 14" M1 Pro. The battery on that thing was absolutely phenomenal - a literal 12-14+ hours of real-world use. Not so much on the 14" M1 Max (64GB) that I upgraded to after about 2 yrs.
'Real-world idle' efficiency on the newer chips is the main reason I've got the (slight) itch to upgrade, but 64GB+ MBPs certainly don't come cheap.
This gap makes no sense to me. I wonder if Apple is just leaning into this cycle because it's easier to make M5s than more advanced processors, so you can sell this sooner?
From a buyer's perspective, I don't like it at all.
M series chips are ridiculously massive, and Apple apparently does not want to transition to chiplets, so they can't easily compose CPUs. Thus refining the process and improving yields on the smaller parts first probably makes sense.
As another example, the current Ultra part is the M3 Ultra, and it was released in early 2025, after even the M4 Pro/Max and a good 18 months after the M3 was unveiled. We might not see an M4 Ultra until 2027.
Different chip SKUs are often a TON of work. If you try to release all of them at the same time, you get a chip pipeline where you need tons of work, all at the same time, all in the same stages of the process. By staggering them, you spread this work out across the year.
I play absolutely everything on my M1 Macbook Pro. Through Crossover, basically every Windows game runs fine. I used to check ahead of time before buying a game, but it's so good I now kind of just assume games work.
An NVIDIA 2080 graphics card from 2018 still surpasses the M5 for gaming. The M5 Pro coming early next year will likely finally catch up with the now 8-year-old 2080.
I'm happy to hear your games work well for you, but it sounds like the games you're playing aren't demanding compared to modern AAA titles. Most games released in the last year or two don't run well on my 2080 test system at anything approaching decent graphics.
A 2080 is about the same performance as a 5060 and every game is going to be able to run on a 5060. You might not be running it at 4K Ultra with ray tracing enabled but you should be able to run at like 1080p High or better.
Whether or not the M5 GPU is actually capable of that level of performance or whether the drivers will let it reach its potential is of course a completely different story. GPU performance is hard to estimate based on raw specs, you just have to run benchmarks and see what you end up with.
A 5060 outperforms a 2080 by roughly 20% on most titles, across the board, not cherry-picking for the best results. They are not about the same.
> you should be able to run at like 1080p High or better
This is disconnected from reality. 1080p low/medium, some games are playable but not enjoyable. Remember, I actually have a 2080, so I'm not just guessing.
> GPU performance is hard to estimate based on raw specs, you just have to run benchmarks and see what you end up with.
Rich coming from someone who claims a 7-year-old graphics card is "about the same" as a card which has 2.5x better ray tracing, 3x faster DLSS, faster VRAM, and much better AI capabilities. The 2080 can't even encode/decode AV1...
I would guess they are mostly talking to game devs for now, but man, if in a few years Apple can get me to throw out the Windows rig that I (and I imagine many others) keep around just for gaming, I wouldn't hate that!
The gaming world is so deeply ingrained with Windows technologies. Even with the Game Porting Toolkit from Apple, I don't see Mac versions getting the mods and patches that some Windows players enjoy.
I do almost all of my gaming on an M3 MacBook Air. It’s great for games. I’ll sometimes hop on Windows for titles unavailable on the Mac, but increasingly I just skip them if they aren’t on Steam for MacOS.
I get that it is good for some games, but when people say "gaming PCs" on Windows, they usually mean AAA titles. The stuff on endcaps at BestBuy for sale for PC and console. Those games won't run well on Macs unless you spend insane amounts on a Max or Ultra variant.
The M5's GPU specs seem to put it near a high-end NVIDIA card from 2018. Impressive as all get out for a power-friendly chip, but not really what I think of when I hear "good for gaming"
Am I remembering right that the previous 14" MacBook Pro started at $1399 (and seems to be no longer available?), so this is a $200 price increase?
(I had just been looking at macs a few weeks ago, and had noticed how close in price macbook pro and macbook air were for same specs -- was thinking, really no reason not to get pro even if all I really want it for is the built-in HDMI. They are now more price differentiated, if I am remembering right).
Good question, although I don't think they would price them differently, because the Trump administration has openly signaled hostility toward private companies that transparently pass on tariff costs, and Apple has openly signaled subservience to the Trump administration.
The 16" doesn't offer the M5(yet), rather the M4 Pro and Max CPUs. Difference also is higher number of performance cores vs efficiency cores and memory bandwidth is significantly higher in the M4 lines(273 and 410 GB/s) versus the M5(153 GB/s).
Apple’s chip release schedule is so borked. It should be High end Pro and Studio first and then iPad, Air, Mini and downgraded Pro. Why they release the iPad and Low End Pro is beyond me.
Everyone buying their high end gear is buying something waiting to be refreshed now.
Hasn't that been the case throughout the industry for the last two decades now? Back when Intel was still on tick-tock, the low-power laptop chips were always first, then mainstream desktop, then workstation and server roughly a year delayed. Maybe I'm mistaken, but it seems to make sense: if you mainly offer monolithic chips, you'd want to start with a smaller die size to better leverage the process.
AMD is somewhat of an exception/unique case though, having chiplet and monolithic designs depending on the use case, plus console/semicustom offerings, so that doesn't map fully.
Also, let's not forget that in Apple's case they actually go phone first, then Air+iPad, then Pro and finally Studio. I personally feel the lower-end devices should get priority though; efficiency gains are more valuable in connected devices with limited space for batteries than in my 16-incher with 100 Wh.
Of course, it would be nice if we just got the entire range updated at once, but I doubt even Apple could pull off such a supply-chain miracle, even if they bought all of TSMC and the entire island to boot...
+1 on this... it also gives them more opportunity to work out any issues when piecing the larger chips together while catching any post-production issues on the simpler, lower end parts before they hit bigger customers.
Well here's the thing: the M5 is (probably) a big A19, the M1 was a big A14, and so forth. The whole thing with Apple silicon is that it's large phone chips (optimized for performance per watt) rather than small workstation chips (performance at all costs).
Is this because they know some whales that want the +1 model are going to jump at the opportunity to buy whatever is on the market, and then buy again when higher-range models are released?
Apple knows their sales numbers, so I imagine they know the base model will sell the most units. Having it out now means more sales at the highest MSRP before talk of the next model release.
Buyers who walk into an Apple store for a base MacBook Pro will wait if they hear a new model is coming out. So if you have a buyer basing purchases on the generation number, it makes sense to launch that model as soon as possible.
Pro/Max buyers generally are checking into specs and getting what they need. Hence the M2 Ultra still being for sale, for some niches that have specific requirements.
It takes 3 months to manufacture a chip end to end. If you find a hardware bug once complete, you have to start over and you end up with a three month delay.
Looks like the Pro and Max will be on a three month delay.
If you find a hardware bug that late, you're not fixing it and you're looking for chicken bits to flip. It is /extremely/ expensive to validate changes and remake the masks.
> Apple’s chip release schedule is so borked. It should be High end Pro and Studio first and then iPad, Air, Mini and downgraded Pro. Why they release the iPad and Low End Pro is beyond me.
People in the U.S. are starting to think about their Christmas shopping lists right about now.
The Pro and Max had way more cores and GPUs and supported way more RAM. Today's release is the basic version of the new CPU; if you want more RAM you can get the M4 Pro or M4 Max based MacBook Pros, or wait for the M5 Pro/Max to come out.
So: MacBook Pro M5, MacBook Pro M5 Pro, MacBook Pro M5 Max. I see the M4 still on offer, which costs more. No WiFi 7 on the M5 (normal or otherwise). 32GB these days, unified or not, in a pro machine... I haven't paid much attention to Macs over the past few years, but I wonder what Steve Jobs would say about this shit.
I've been a Windows fan forever, but the new Mac hardware is making it hard to remain one, and it's about time for a new laptop... you can't get a proper Windows install on these chips like you could on the Intel-based ones, only virtualized.
The base M1 was like 4-5 years ago. Did we keep the same base RAM with the oldest Macs too, 30 years ago? 4-5 years at the same base RAM is incredibly cheap behaviour from Apple. Some phones are literally close to that much RAM now.
You could (unsupported) run 16GB of RAM in the 2010 MBP models, back before it was soldered on. Worked great, not to mention swapping the spinning drive for an SSD.
At this point I get the soldered-on RAM, for better or worse... I do wish at least storage was more approachable.
The Pro sales page says their RAM is unified, which is more efficient than traditional RAM. Anyone have a sense of how much better unified RAM performs vs traditional RAM?
Their sales copy for reference:
"M-series chips include unified memory, which is more efficient than traditional RAM. This single pool of high-performance memory allows apps to efficiently share data between the CPU, GPU, and Neural Engine.... This means you can do more with unified memory than you could with the same amount of traditional RAM."
The fact that it's unified just means it's shared between CPU and GPU usage, which can be a good thing. A lot of the performance comes from more channels and a more stable distance from the CPU itself... getting really fast performance from RAM is more difficult with a detachable interface in the middle and longer traces.
Still not as fast as the RAM used for dedicated GPUs, but faster than most x86 options.
It just means it's shared between GPU and CPU. That has its advantages in specific workloads, but dedicated super-fast GPU RAM usually is better.
Everything else in this statement is marketing bullshit and Apple trying to look like they invented the wheel and discovered fire.
Well, they could have continued the 8GB joke, so let's appreciate the fact that they finally switched to 16GB base models (and similarly stopped the 128GB SSD madness; those models were outdated when bought).
For perspective, I have a 12-year-old MacBook with 8 gigs of RAM and it's still perfectly usable for all the things I do on it. If you need more RAM because you are video encoding, compiling, or gaming (why!?) then you aren't a basic consumer.
I'm not trying to be a fanboy, and maybe it's a little bit "cope", but Apple has always put as much RAM as is necessary for the computer to work, and not a lot more, in their base models.
> I know it’s silly, but I think I represent over 90% of apples customers in that way
You're not silly, you're just able to see reality.
Apple knows who is buying the bulk of their computers, and it isn't power users ... most people buying computers don't have a clue what RAM is even used for.
I'd expect to hit beachballs, but macOS manages 8GB of RAM fine for regular users, even with Tahoe.
And a "pro" computer that comes with half a TB of storage by default, with a $200 premium for another 0.5 TB of storage. Oof. Just gross.
I know people complain at every release. But I look at the three choices presented and they are all disappointing to me. It's a huge turnoff to see the only initial differentiator presented to be a choice between "measly" amounts of RAM and storage to "barely acceptable" amounts.
To get even close to the specs on my Surface Pro, I'd have to hit the configurator and spend at least $1000. Even more to match the config of my work-issued HP notebook.
Probably great, but when the hell are they going to do another damn colour? Hoping by the time I upgrade from the M4 Pro they'll have a green version and a cell modem.
- m4 -> m5, same core number and distribution, "neural accelerators", higher memory bandwidth
- max storage increased from 2 to 4TB (and probably an extra kidney in price)
Everything else is strictly identical.
The marketing blurb claims 2~3x the performance of the M4 for AI stuff (I assume that's the neural accelerators), 20% for CPU tasks (compiling), and 40~80% for GPU tasks (gaming and 3D rendering).
I’m comparing comparable configurations. You can get 128GB ram and 8TB storage on an m4 max, and 512GB ram and 16 TB storage on an m3 ultra. Neither is relevant to the M5.
TBH, I have an M2 Pro (personal) and an M4 Pro (work), and I have never been able to tell the difference in day-to-day use. That said, the only really intensive workload I have is photography/videography batch post-processing, and I haven't tested that on my work machine. I'm disappointed Apple hasn't published any benchmarks comparing the M5 to the various M4 (or M3) variants.
Especially considering the M5 iPad pro has WiFi 7.
Sounds like maybe they didn't want to try to fit their new N1 chip in this go-around, so they could re-use some components? The MacBook still has the same Broadcom chip. Or they're saving it as a differentiating feature for when the M5 Pro/Max comes out later. There's a rumored MBP redesign, so I'm guessing we'll see it then, along with the N1 for WiFi 7.
I run our office IT, and WiFi 7 is just better at managing congestion. We have a floor in a busy building and 5Ghz is chaos. 6E is fine, it's just strangely old for a company like Apple.
Sad to release without refreshing the high-end 16” MBP line. I worry they nerfed the 14” MBP to ensure the M4 16” retained better specs, to not make the discontinuity worse. Otherwise the 14” outperforming the more expensive 16” would be uncomfortable.
Isn't the screen-size difference basically enough between these two? I can't see why the 16" would need more performance; some people just want / can carry large computers with them, while others prefer to have something as small as possible.
It's more a statement on price and the assumption that the more expensive one, with the "more capable" chip like the Max, would be expected not to be less performant than anything else in the lineup. It would be a disappointment, especially for me as I'm about to buy a 16" in November regardless, to be a generation behind while paying more, and it would not be unusual for product reasons to nerf the lower-priced chip to ensure it didn't cannibalize the more expensive model's sales.
I have to say, if I had any choice I would delay my purchase until the 16" catches up rather than buy a generation behind. If I see specs saying the M5 14" is more performant for my workloads than my more expensive 16", I'm even more motivated to delay. Most product managers would be aware of these things.
I can see why that sounds sensible, but my personal observation is that heavy-duty power users almost universally prefer the bigger screens, and those people also want the highest-level specs. Most people I know who want smaller screens are not serious power users.
I can see an overlap with people who want smaller computers who also want max power, but I just would not believe that is a significant group. (again, all personal observations)
More pixels? That's the only reason I can think of. 13/14 inch is what I tend to go for, since I use my laptop as a desktop 80% of the time. 16 is really too big for my needs.
I also think the 15 inch MacBook Air filled the non-power-user-but-likes-big-screen niche.
Does it have full-sized cursor keys? PgUp/PgDown/Home/End keys? If not, it's a fashion accessory; it's definitely not a PROductivity machine used by PROfessionals.
>I thought I would use it for playing with LLMs, but local dev on a Mac is not fun and I still don't have it fully set up.
Sounds more like a you problem, probably due to unfamiliarity. There are endless options for local dev on a Mac, and a huge share of devs using one.
> I don't agree with OP's sentiment that macOS is a bad dev environment, but surely I prefer Linux+KDE as an overall dev environment. I find that all the tools I need are there but that I'm fighting the UI/UX enough to make it a hassle relative to KDE.
This sounds like you think macOS is a good dev environment, but that you personally don't like the UI/UX. (It's always safer to make UI/UX judgements subjective ["I don't like"] rather than objective ["it's bad"], since UI/UX is so difficult to evaluate objectively, compared to saying something like "Docker doesn't run natively on macOS", which is just an objective fact.)
I literally started off my comment saying that it's not bad. That means it's neutral-good, by definition.
I can easily develop on both, I prefer developing on Linux. Thus, it is "more good" (IMO), if you prefer.
I've been on a mac for ~4 years now.
It was a bit of a struggle to get used to it, coming from windows.
The only thing I really miss now is alt-tab working as expected. (It's a massive pain to move between two windows of the same program)
You know you can use CMD+backtick (CMD+`) to cycle between windows of the same app? Add shift to go in reverse.
Or otherwise you can enable the app exposé feature to swipe down with three fingers and it will show you only windows of the same app.
This usually doesn't work for me.
For example, if I open a new Firefox window, the Mac seems to force the two Firefox windows onto different desktops. This already is a struggle, because sometimes I don't want the windows to be on two desktops. I find that if I try to move one window to the same desktop as the other, then Mac will move the other desktop to the original desktop so they are both still on different desktops.
OK, got sidetracked there on a different annoyance, but on top of the above, CMD-backtick doesn't usually work for me, and I attribute it to the windows typically being forced onto different desktops. Some of the constraints for using a Mac are truly a mystery to me, although I'm determined to master it eventually. It shouldn't be this difficult though. For sure, Mac is nowhere near as intuitive as it's made out to be.
To reproduce: get a second monitor, throw your web browser onto that second monitor (not in full screen), and then open an application in full screen on your laptop's screen (I frequently have a terminal there). Then go to a site that gives you a popup for OAuth or a security key (e.g. GitHub, Amazon, Claude; you've got a million options here). Watch as you get a jarring motion on the screen you aren't looking at, have to finish your login, and then move back to where you were.
Everyone tells me how pretty and intuitive Macs are, yet despite being on one for years I have not become used to them. It is amazing how many dumb and simple little problems arise out of normal behavior, like connecting a monitor. What brilliant engineer decided it was a good idea to not allow certain resolutions despite the monitor... being that resolution? Or all the flipping back and forth. It's like they looked at KDE workspaces and said "let's do that, but make it jarring and not actually have programs stay in their windows". I thought Apple cared about design and aesthetics, but even as a Linux user I find these quite ugly and unintuitive.
Full-screen windows (the little green button at the top) seem to get "their own desktop". Has tripped me up a few times.
Or just put a program onto a second monitor then open a second window for that program. Usually it will not open in the same monitor. This is especially fun when you get pop-ups in browsers...
> Or otherwise you can enable the app exposé feature to swipe down with three fingers and it will show you only windows of the same app.
If you have an Apple keyboard, CTRL-F3 (without the Fn modifier) will do the same. Not sure if there are third-party keyboards that support Mac media keys, but I'm guessing there are some at least...
Only sometimes it doesn't work. (For me on a Norwegian keyboard it is CMD+<)
Specifically, sometimes it works with my Safari windows and sometimes it doesn't.
And sometimes when it doesn't work, Option+< will work for some reason.
But sometimes that doesn't work either, and then I just have to swipe and slide, or use alt-tab (yes, you can now install a program that gives you proper alt-tab, so I do not have to deal with this, IMO, nonsense; it just feels like the right thing when I know I'm just looking for the other Safari window).
I'm not complaining; I knew what I was getting into when I asked $WORK for a Mac. I have had one before, and for me the tradeoff of having a laptop supported by IT and with good battery life is worth it, even if the UX is (again, IMO) somewhat crazy for a guy who comes from a C64 -> Win 3.1 -> Windows 95/98 -> Linux (all of them, and a number of weird desktops) background.
That has terrible ergonomics for anyone using a non-US keyboard, though - the backtick is immediately above the option key so to hit together with CMD requires clawing together your thumb and little finger.
GNOME does this much better, as it instead uses Super+<whatever the key above Tab is>. In the US, that remains ` but elsewhere it's so much better than on MacOS.
> That has terrible ergonomics for anyone using a non-US keyboard, though - the backtick is immediately above the option key so to hit together with CMD requires clawing together your thumb and little finger.
That's true, which is why I remap it to a "proper" key above Tab:
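(The snippet wasn't included, but here's a minimal sketch of one way to do such a remap with Apple's hidutil; the HID usage values 0x35 and 0x64 follow Apple's own ISO-keyboard remapping examples, so verify them for your keyboard before relying on this.)

    # Sketch: swap the ISO key above Tab with the backtick key so that
    # Cmd+<key above Tab> cycles windows. Resets on reboot unless run
    # from a LaunchAgent.
    hidutil property --set '{"UserKeyMapping":[
      {"HIDKeyboardModifierMappingSrc":0x700000064,"HIDKeyboardModifierMappingDst":0x700000035},
      {"HIDKeyboardModifierMappingSrc":0x700000035,"HIDKeyboardModifierMappingDst":0x700000064}
    ]}'

Alternatively, without any terminal: System Settings > Keyboard > Keyboard Shortcuts > Keyboard > "Move focus to next window" lets you record a new combo directly.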
I’m a lifelong Mac user and didn’t know this.
Shame on me.
And there's more from where that came from: https://support.apple.com/en-us/102650
Apple really doesn't tell power-users about a lot of these features. You can really gain a lot by searching for Mac shortcuts and tricks. I still learn new things that have been around for over a decade.
I'd argue if you need to be told about keyboard shortcuts, then you're not a power user. (I.e., knowing how to find keyboard shortcuts I'd consider a core trait of power users).
Keyboard shortcuts should be exposed in some fashion. IMO, Microsoft is typically better at this.
What specifically does Microsoft do that Apple should do?
As a hybrid macOS / Windows user (with 20+ years of Windows keyboard muscle memory), I found Karabiner Elements a godsend. You can import complex key modifications from community-built scripts which will automatically remap things like Cmd+Tab to switch windows, as well as a number of other Windows hotkeys to macOS equivalents (links below):
https://karabiner-elements.pqrs.org/
https://ke-complex-modifications.pqrs.org/?q=windows#windows...
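For anyone curious what those imported rules actually look like under the hood, this is a minimal sketch of a single complex-modification rule (the key choice and bundle identifier are purely illustrative, not from any particular community script):

    {
      "description": "Sketch: Ctrl+C copies outside the terminal (illustrative)",
      "manipulators": [
        {
          "type": "basic",
          "from": { "key_code": "c", "modifiers": { "mandatory": ["left_control"] } },
          "to": [ { "key_code": "c", "modifiers": ["left_command"] } ],
          "conditions": [
            { "type": "frontmost_application_unless",
              "bundle_identifiers": ["^com\\.apple\\.Terminal$"] }
          ]
        }
      ]
    }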
https://alt-tab-macos.netlify.app/
I don't think I could survive on MacOS without AltTab.
Play with your keyboard, alt, ctrl, cmd + tab or ~ or combos of those will do wild things for ya
Apparently I'm the only one who didn't know about cmd+`
You know about Cmd+Backtick, do you?
CMD + ~.
The biggest problem with Linux is poor interfaces[0], but the biggest problem with Apple is handcuffs. And honestly, I do not find Apple interfaces intuitive. Linux interfaces and structure I get, and even if the barrier to entry is a bit higher, there's lots of documentation. Apple less so. But also with Apple there are just things that are needlessly complex, buried under multiple different locations, and inconsistent.
But I said the biggest problem is handcuffs. So let me give a very dumb example. How do you merge identical contacts? Here's the official answer[1]
Well guess what? #2 isn't an option! I believe that option only appears if the two contacts are in the same address book. Otherwise you get the option "Link Selected Cards", something that isn't clear since the card doesn't tell you which account it is coming from, and clicking "Find duplicates" won't offer this suggestion to you. There are dozens of issues like this where you can be right that I'm "holding it wrong", but that just means the interface isn't intuitive. You can try this one out: go to your contacts, select "All Contacts", then click any random one and try to figure out which address book that contact is from. It will not tell you unless you have linked contacts. And that's the idiocy of Apple. Everything works smoothly[2] when you've always been on Apple and only use Apple, but it's painful to even figure out what the problem is when you have one. The docs are horrendous. The options in the menu bar change and inconsistently disappear or gray out, leading to "where the fuck is that button?".
So yeah, maybe a lot of this is due to unfamiliarity, but it's not like they are making it easy. With Apple, it is "Do things the Apple way, or not at all". But with Linux it is "sure, whatever you say ¯\_(ツ)_/¯". If my Android phone is not displaying/silencing calls, people go "weird, have you tried adjusting X settings?" But if my iPhone is not displaying/silencing calls, an Apple person goes "well my watch tells me when someone is calling", and they do not understand how infuriating such an answer is. Yet, it is the norm.
I really do want to love Apple. They make beautiful machines. But it is really hard to love something that is constantly punching you in the face. Linux will laugh when you fall on your face, but it doesn't actively try to take a swing or put up roadblocks. There's a big difference.
[0] But there's been a big push the last few years to fix this and things have come a long way. It definitely helps that Microsoft and Apple are deteriorating, so thanks for lowering the bar :)
[1] https://support.apple.com/guide/contacts/merge-contact-cards...
[2] Except it actually doesn't
> With Apple, it is "Do things the Apple way, or not at all".
Well, kinda. You don't have to use all that much Apple software on Macs, though. If you can live with the window manager / desktop environment, then you can use whichever apps you choose for pretty much everything else.
> The window manager. I'm not a fan of all the animations and needing to gester between screens (and yes, I've been down the hotkeys rabbit hole). To install a 3rd party window manager, you need to disable some security setting because appearantly they work by injecting into the display manager and calling private APIs.
Specifically for this, there's Aerospace (https://github.com/nikitabobko/AeroSpace) which does not require disabling SIP, intentionally by the dev.
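If it helps anyone evaluate it: AeroSpace is configured with a plain TOML file, so a minimal sketch (the bindings here are just examples, not defaults you must keep) looks something like this:

    # ~/.config/aerospace/aerospace.toml
    [mode.main.binding]
    alt-1 = 'workspace 1'                      # jump to workspace 1
    alt-2 = 'workspace 2'
    alt-shift-1 = 'move-node-to-workspace 1'   # send the focused window there
    alt-h = 'focus left'                       # vim-style directional focus
    alt-l = 'focus right'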
For using the vanilla macOS workspaces though: if you avoid full-screen apps (since those go to their own ephemeral workspace that you can't keybind, for some stupid reason) and create a fixed number of workspaces, you can bind keyboard shortcuts to switch to them. I have 5 set up, and use Ctrl+1/2/3/4/5 to switch between them instead of using gestures.
Apart from that, I use Raycast to set keybindings for opening specific applications. You can also bind Apple Shortcuts that you make.
Still not my favorite OS over Linux, but I've managed to make it work because I love the hardware, and outside of $dayjob I do professional photography, and the Adobe suite runs better here than on even my insanely overspecced gaming machine on Windows.
Mac laptop hardware is objectively better, but I am in the same camp as the parent post. For most development workflows, Linux is my favorite option. In particular, I think NixOS and the convenience of x86_64 are usually worth the energy-efficiency deficit versus Apple M.
It will be interesting to see how this evolves as local LLMs become mainstream and support for local hardware matures. Perhaps, the energy efficiency of the Apple Neural Engine will widen the moat, or perhaps NPUs like those in Ryzen chips will close the gap.
I develop using a MacBook because I like the hardware and non-development apps, but all my actual work happens on a Linux server I connect to. It's a good mix.
As a long-term Mac user who works with ROS a lot, I hear you. Most people here think local dev means developing a React app. Outside of mainstream web frameworks, the Mac sucks for local dev.
I’ve found Macs to be good for most dev stuff with the exception of non-Dockerized C++.
Unfortunately I do a lot of C++… I hate the hoops you have to jump through to not use the Apple Clang compiler.
Is there a HN bingo card? Because we always get a top comment for Linux user who tries Mac and decides they prefer Linux.
I kinda agree with the OP, but then I was a Linux user for well over a decade. I do think that C/C++ libraries are much, much more of a pain on Mac as soon as you go off the beaten path (compiling GDAL was not pleasant, whereas it would be a breeze on Linux).
Some of this is probably brew not being as useful as apt, and some more of it is probably me not being as familiar with the Mac stuff, but it's definitely something I noticed when I switched.
The overall (graphical) UI is much more fluid and convenient than Linux, though.
I have to agree. The loss of sense of reality among Linux fanboys is really annoying.
I had been a Linux notebook user for many years and have praised it on this board years ago. But today the Linux desktop has regressed into a piece of trash even for basic command line usage while providing zero exclusive apps worth using. It's really sad since it's unforced and brought upon Linux users by overzealous developers alone.
Mac OS trounces Linux in absolutely every way on the desktop; it's not even funny: performance, battery life, apps, usability, innovation. Available PC notebook hardware is a laughable value compared to even an entry-level Apple MacBook Air. Anecdata, but I have had no less than five "pro" notebooks (Dell Latitude, XPS, and Lenovo Thinkpad) come and go in the last five years with basic battery problems, mechanical touchpad problems, touchpad driver issues, WLAN driver issues, power management issues, gross design issues, and all kinds of other crap, so I'm pretty sure I know what I'm talking about.
The one thing the Mac isn't great for is games, and I think SteamOS/Proton/Wine is coming along nicely, and timely, as Windows is finally turning to the dark side entirely.
> Mac OS trounces Linux in absolutely every way on the desktop it's not even funny: performance, battery life, apps, usability, innovation.
performance - I don't agree
battery life - absolutely
apps - absolutely
usability - I don't agree
innovation - I don't agree
One significant annoyance associated with Linux on a laptop is that configuring suspend-then-hibernate is an arduous task, whereas it just works on a Macbook.
But, the main thing is commercial application support.
I can relate. I've spent almost 30 years working primarily on Linux. I moved Windows to be under VM when I needed it around for occasionally using MS Office, first under vmware and later under kvm. Now I don't even use it as a VM, since work has Office 365.
My work got me a similar M4 MacBook Pro early this year, and I find the friction high enough that I rarely use it. It is, at best, an annoying SSH over VPN client that runs the endpoint-management tools my IT group wants. Otherwise, it is a paperweight since it adds nothing for me.
The rest of the time, I continue to use Fedora on my last gen Thinkpad P14s (AMD Ryzen 7 PRO 7840U). Or even my 5+ year old Thinkpad T495 (AMD Ryzen 7 PRO 3700U), though I can only use it for scratch stuff since it has a sporadic "fan error" that will prevent boot when it happens.
But, I'm not doing any local work that is really GPU-dependent. If I were, I'd be torn between chasing the latest AMD iGPU that can use large (but lower-bandwidth) system RAM and rekindling my old workstation habit to host a full-size graphics card. It would depend on the details of what I needed to run. I don't really like the NVIDIA driver experience on Linux, but I have worked with it in the past (when I had a current-gen Titan X), and I have also done OpenCL on several vendors' hardware.
Speaking of the P14s, I have an Intel version from 2 years back and the battery life is poor. And I hunger for the Mac's screen for occasional photography. The other thing I found difficult is that there's no equivalent of the X1 Carbon with an AMD chip; it's Intel-only. The P14s is so much heavier.
I thought the same thing when I saw the M5 in the news today. It’s not that I hate macOS 26, hate implies passion.. what I feel is closer to disappointment.
The problem is their philosophy. Somewhere along the way, Apple decided users should be protected from themselves. My laptop now feels like a leased car with the hood welded shut. Forget hardware upgrades, I can’t even speed up animations without disabling SIP. You shouldn’t have to jailbreak your own computer just to make it feel responsive.
Their first-party apps have taken a nosedive too. They’ve stopped being products and started being pipelines, each one a beautifully designed toll booth for a subscription. What used to feel like craftsmanship now feels like conversion-rate optimization.
I’m not anti-Apple. I just miss when their devices felt like instruments, not appliances. When you bought a Mac because it let you create, not because it let Apple curate.
> I still don't have it fully set up
Highly recommend doing nix + nix-darwin + home-manager to make this declarative. Easier to futz around with.
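A minimal flake sketch of that combo, in case it helps; "my-mac" and "me" are placeholders, and exact module options shift between nix-darwin/home-manager releases, so treat this as a starting point rather than a canonical config:

    # flake.nix
    {
      inputs = {
        nixpkgs.url = "github:NixOS/nixpkgs/nixpkgs-unstable";
        nix-darwin.url = "github:LnL7/nix-darwin";
        nix-darwin.inputs.nixpkgs.follows = "nixpkgs";
        home-manager.url = "github:nix-community/home-manager";
        home-manager.inputs.nixpkgs.follows = "nixpkgs";
      };
      outputs = { self, nixpkgs, nix-darwin, home-manager }: {
        # apply with: darwin-rebuild switch --flake .#my-mac
        darwinConfigurations."my-mac" = nix-darwin.lib.darwinSystem {
          modules = [
            { nixpkgs.hostPlatform = "aarch64-darwin"; }
            home-manager.darwinModules.home-manager
            ({ pkgs, ... }: {
              environment.systemPackages = [ pkgs.ripgrep pkgs.jq ];
              users.users.me.home = "/Users/me";
              home-manager.users.me = {
                programs.zsh.enable = true;
                home.stateVersion = "24.05";  # example; pin to your HM release
              };
            })
          ];
        };
      };
    }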
Seconded. I have a mostly CLI setup and in my experience Nix favors that on Mac, but nonetheless it makes my Nix and Linux setups a breeze. Everything is in sync, love it.
Though if you don't like Nixlang it will of course be a chore to learn/etc. It was for me.
drop $20/month for any LLM and you don't even have to learn it!
LLMs are uniquely bad at writing nix configs I've found. The top models all regularly hallucinate options
Really useful for debugging though
Really? This surprises me. I've used them for projects and for my home-manager setup and it's always been amazing at it. The best example I can come up with is packaging a font I needed into a nix package for a LaTeX file. It would have taken me a month of trying various smaller projects to know how to do that.
Honestly it helped quite a bit. There are a lot of obscure (imo) errors in Nix that LLMs spot pretty quickly. I made quite a bit of progress since using them for this.
Yeah it's like 99% responsible for all of my flake files; I wasn't being facetious!
Yes, macOS sucks compared to Linux, but the M chips get absolutely incredible battery life, whereas the Framework gets terrible battery life. I still use my Framework at work, though.
Yes, there is a dilemma in the Linux space. But is running Linux on a MacBook a viable option these days? Is Asahi Linux solid enough?
I much prefer a Framework and the repairability aspect. However, if it's going to sound like a jet engine and have half the battery life of a new M-series Mac, then I feel like there's really no option if I want solid battery life and good performance.
Mac has done a great job here. Kudos to you, Mac team!
Oh, I have that tab still open from when I was reading the other thread. Here is the feature-support overview from Asahi. Still a way to go unless you are on an older M1, it looks like?
https://asahilinux.org/docs/platform/feature-support/overvie...
I've been forced to use Macbooks for development at work for the past 7 years. I still strongly prefer my personal Thinkpad running Debian for development in my personal life. So don't just put it down to lack of familiarity.
Try Aerospace. Completely solved window management for me.
Also for dev, set up your desired environment in a native container and then just remote into it with your terminal of choice. (Personally recommend Ghostty with Zellij or Tmux)
I appreciate this comment!
I'm often envious of these Macbook announcements, as the battery life on my XPS is poor (~2ish hours) when running Ubuntu. (No idea if it's also bad on Windows - as I haven't run it in years).
Thanks for the heads-up.
Not that useful as a heads up.
MacOS is great for development. Tons of high profile devs, from Python and ML, to JS, Java, Go, Rust and more use it - the very people who headline major projects for those languages.
2ish hours battery life is crazy. It's 8+ hours with the average Macbook.
> It's 8+ hours with the average Macbook.
Did I get a dud? I rarely get over 2.5
If you are on a M-series MacBook and aren't running a 3D Benchmark the entire time, your Mac is broken if it is dying after 2.5 hours.
Have you checked your Battery Health?
If you have an Intel-based Mac, it's the same expected battery life as a Windows machine, and 2.5 hours on an Intel MacBook battery sounds decent for something 5+ years old.
8+ hours sounds about right. I have a M1 Macbook Pro and even 5 years later I can still use it (a million browser tabs, couple containers, messaging apps) for an entire day without having to charge it.
Yeah, you have a dud. Or you have some process running in the background that's gobbling up all the energy.
With Apple Silicon? Yes, that is very low for typical dev usage.
Gaming is another story though, or any other use that puts a lot of stress on the GPU.
Use Aerospace for window management. No animations. No disabling of security. It just works. https://github.com/nikitabobko/AeroSpace
I get the comment about Docker. Not being able to share memory with Docker makes it a pain to run things alongside macOS, unless you have mountains of RAM.
Hello! Yes! Writing this on my commute home using my company's M3 Pro, and I hate it. I'm waiting for a new joiner so I can hand this off to someone who has a different brain than mine.
I can write up all the details, but it's well covered on a recent linuxmatters.sh and Martin did a good job of explaining what I'm feeling: https://linuxmatters.sh/65/
What are the quirks with local dev that make it not fun?
There are surprisingly a lot of permission headaches and rug pulls in the last few big OS updates that have been really annoying.
Any examples? I've been using an mbpr since 2014 and haven't seen any changes recently, except that you have to hit the "allow" button a few times.
They're basically things along those lines. They're more nefarious when background services quietly error out and you need to dig to find it was a newly required permission.
Launching an unsigned app now requires going to Settings manually and allowing it there, instead of just allowing it on launch.
I use macOS, with all kinds of languages locally, plus vms, kubernetes, LLMs, etc and seen no such issues.
What "permission headaches"?
Launching unsigned apps is a problem, especially if an app bundle contains multiple binaries, since by default you need to approve an exception for each of them separately.
I know that it's possible to script this, since Homebrew handles it automatically, but if you just want to use a specific app outside of Homebrew, the experience is definitely worse than on Linux/Windows.
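One workaround that usually saves the per-binary dance, assuming the prompts come from the quarantine flag Gatekeeper puts on downloaded files (the app path below is just an example):

    # Strip the quarantine attribute recursively so every binary inside the
    # bundle stops triggering its own approval prompt. Only do this for apps
    # you actually trust.
    xattr -dr com.apple.quarantine /Applications/SomeApp.app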
There are a lot of annoying hurdles when allowing some types of application access: needing to manually allow things in the security menu, allowing unrecognized developers, unsigned apps. Nothing insurmountable so far, but progressively more annoying for competent users who want control over their devices.
For me, coming from Linux, the things I don't like: the overview menu's lack of an (x) to close a window; the way Slack stacks windows within the app so it's hard to find the right one; and pressing the red button not removing the app from your CMD+Tab cycle between apps, since you also have to press CMD+Q. (Just a preference for how Windows and Linux treat windows: actually closing them.) Rectangle resolved the snap-to-corner thing (I know macOS has it natively too, but it's not great in comparison).
Things I prefer: Raycast + its plugins compared to the Linux app-search tooling, battery life, performance. Brew vs the Linux package managers, I don't notice much of a difference.
Things that are basically the same: the dev experience (just a shell, and my dotfiles have it essentially the same between OSes).
I think the hardest part for me is getting used to using CMD vs CTRL for cut-copy-paste; then, when I start to get used to it, a terminal breaks me out of it with a different key for Ctrl+C. I got used to Ctrl+Shift in terminals on Linux (and Windows) for cut-copy-paste, etc.
It may seem like a small thing, but when you have literal decades of muscle memory working against you, it's not that small.
I'm a lifelong Mac user, so obviously I'm used to using CMD instead of CTRL. Inside the terminal we use CTRL for things like CTRL-C to exit a CLI application.
What messes me up when I'm working on a linux machine is not being able to do things like copy/paste text from the terminal with a hotkey combo because there is no CMD-C, and CTRL-C already has a job other than copying.
IMO Apple really messed up by putting the Fn key in the bottom-left corner of the keyboard instead of CTRL. Those keys get swapped on every Mac I buy.
Ctrl+Shift+(X,C,V) tends to work for many/most terminals in Linux and Windows (including Code and the new Terminal in Windows)...
I agree on the Fn key positioning... I hate it in the corner and tend to zoom in on the keyboard when considering laptops for anyone, just in case. I've also had weird arrow keys on the right side of a laptop keyboard, where I'd hit the up arrow instead of the right shift a lot in practice... really messed up text-area input.
I knew there must be some extra hot key, but like you said, muscle memory.
It's the same thing when switching from a Nintendo to a Western game where the cancel/confirm buttons on the gamepads are swapped.
As a very long-term Linux user, I'm still aggravated when implicit copy and middle-click paste doesn't just work between some apps, since it is so deeply embedded in my muscle memory!
I'm only a recent MacOS user after not using it for over 20 years, so please people correct me if I'm wrong.
But in the end the biggest thing to remember is in MacOS a window is not the application. In Windows or in many Linux desktop apps, when you close the last or root window you've exited the application. This isn't true in MacOS, applications can continue running even if they don't currently display any windows. That's why there's the dot at the bottom under the launcher and why you can alt+tab to them still. If you alt+tab to an app without a window the menu bar changes to that app's menu bar.
I remember back to my elementary school computer lab with the teacher reminding me "be sure to actually quit the application in the menu bar before going to the next lesson, do not just close" especially due to the memory limitations at the time.
I've found once I really got that model of how applications really work in MacOS it made a good bit more sense why the behaviors are the way they are.
Docker being nerfed is pretty much the only thing I can think of.
Have you tried Apple's "container" tool?
https://github.com/apple/container
Docker works very weirdly (it's a desktop application you have to install, with usage restrictions in enterprise contexts, and it runs inside a VM so some things don't work), or you have to use an alternative with similar restrictions (Podman, Rancher Desktop).
The OS also has weird rough edges when used from the terminal - there are read-only parts, there are restrictions on loading libraries, multiple utilities come with very old versions or BSD versions with different flags than the GNU ones you might be used to coming from Linux, the package manager is pretty terrible. There are things (e.g. installing drivers to be able to connect to ESP32 devices) that require jumping through multiple ridiculous hoops. Some things are flat out impossible. Each new OS update brings new restrictions "for your safety" that are probably good for the average consumer, but annoying for people using the device for development/related.
Docker on Mac has one killer feature though: bind mounts remap permissions sensibly, so the uid/gid in the container is the correct value for the container rather than the same uid/gid as the host.
The workarounds on the internet are like "just build the image so that it uses the same uid you use on your host", which is batshit crazy advice.
I have no idea how people use Docker on other platforms where this doesn't work properly. One of our devs has a Linux host and was unable to use our dev stack, and we couldn't find a workaround. Luckily he's a frontend dev, and he eventually just gave up on the dev stack in favour of running Requestly to forward the frontend from prod to his local tooling.
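For what it's worth, the usual Linux-side mitigation I've seen (not a real fix, and it won't help when the image bakes in its own user) is to run the container as the host uid/gid:

    # Files created under the bind mount stay owned by you on the host.
    docker run --rm -it \
      --user "$(id -u):$(id -g)" \
      -v "$PWD:/work" -w /work \
      alpine:3.20 sh -c 'touch from-container && ls -ln from-container'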
>The OS also has weird rough edges when used from the terminal - there are read-only parts, there are restrictions on loading libraries, multiple utilities come with very old versions or BSD versions with different flags than the GNU ones you might be used to coming from Linux, the package manager is pretty terrible.
You use nix or brew (or something like MacPorts).
And they are mighty fine.
You shouldn't be concerned with the built-in utilities.
IIRC many of the built-in tools were updated from FreeBSD in the last release, but they'd still be different from GNU.
Brew is pretty terrible though. It's slow, and doesn't handle updates/versions/dependencies all that well.
I've had it make major updates (with breaking changes) to random software when asked to install something unrelated.
I suggest trying Nix on macOS; it is very nice as a package manager, but it can also be used as a way to replace Docker (at least for my needs it works very well). These days I don't even bother installing brew on my Mac, I only use Nix.
Very interesting. I'm going to start using Nix, it seems, based off skimming how it works and how it can replace Docker.
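For the Docker-replacement use case, the building block is usually a per-project dev shell; a minimal sketch (the package choices are just examples):

    # flake.nix
    {
      inputs.nixpkgs.url = "github:NixOS/nixpkgs/nixpkgs-unstable";
      outputs = { self, nixpkgs }:
        let
          system = "aarch64-darwin";   # adjust for your machine
          pkgs = nixpkgs.legacyPackages.${system};
        in {
          # enter with: nix develop
          devShells.${system}.default = pkgs.mkShell {
            packages = [ pkgs.nodejs_22 pkgs.postgresql_16 pkgs.redis ];
          };
        };
    }

Every machine with the same flake.lock gets the same toolchain, which covers a lot of what people reach for Docker to do in local dev.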
>but local dev on a Mac is not fun
What are the differences though? I have an mbpr and a PC with Fedora on it, and I barely see any differences aside from the sandboxing in my atomic Kinoite setup and a different package manager.
People often hate on brew, but as a backend dev I haven't encountered any issues for years.
I can tell you in one sentence: try to run a DNS server when mDNSResponder sits on port 53 (for example, because you use the new virtualization framework).
And there are a lot of such things which are trivial, or a non-problem, on Linux.
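If you hit this, a quick way to confirm who owns the port before fighting it:

    # Show which process is bound to port 53 (on stock macOS: mDNSResponder).
    sudo lsof -nP -iTCP:53 -iUDP:53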
The issues I see people struggle with on a Mac are that development often needs things in a non-default, and often less secure, setup.
There isn't a "dev switch" in macOS, so you have to know which setting is getting in your way. Apple doesn't like to EVER show error alerts if at all possible to suppress, so when things in your dev environment fail, you don't know why.
If you're a seasoned dev, you have an idea why and can track it down. If you're learning as you go or new to things, it can be a real problem to figure out if the package/IDE/runtime you're working with is the problem or if macOS Gatekeeper or some other system protection is in the way.
I also like the multi-desktop experience on KDE more, but I've recently found out you can at least switch off some of the annoying behavior in the Mac settings, so that e.g. it no longer switches to another desktop if you click on a dock icon that is open on another desktop.
Since I mostly live in the terminal (ghostty) or am using the web browser I usually don't have to deal with stupid Apple decisions. Though I've found it quite painful to try to do some even basic things when I want to use my Macbook like I'd use a linux machine. Especially since the functionality can change dramatically after an update... I just don't get why they (and other companies) try to hinder power users so much. I understand we're small in numbers, but usually things don't follow flat distributions.
There are often better ways around this. On my machine, my OSX config isn't really about OSX specifically but about what programs I might be running there[0]. Same goes for Linux[1], which you'll see is pretty much just about CUDA and aliasing apt to nala if I'm on a Debian/Ubuntu machine (sometimes I don't get a choice). I think what ends up being more complicated is when a program has a different name under a distro or version[2], though that can be sorted out by a little scripting. This definitely isn't the most efficient way to do things, but I write it like this so that things are easier to organize, turn on/off, or for me to try new things.
What I find more of a pain in the ass is how commands like `find`[3] and `grep` differ. But usually there are ways to get them to work identically across platforms (a sketch of the pattern follows the links below).
But yeah, I don't have a solution to this... :(
[0] https://github.com/stevenwalton/.dotfiles/blob/master/rc_fil...
[1] https://github.com/stevenwalton/.dotfiles/blob/master/rc_fil...
[2] https://github.com/stevenwalton/.dotfiles/blob/master/rc_fil...
[3] https://github.com/stevenwalton/.dotfiles/tree/master/rc_fil...
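If it helps anyone, the usual shape of the OS split is a uname switch; the rc file names below are made up for illustration, and the last line assumes Homebrew's g-prefixed GNU packages:

    # in a shared .zshrc/.bashrc -- rc_mac/rc_linux are hypothetical names
    case "$(uname -s)" in
      Darwin) source "$HOME/.dotfiles/rc_mac" ;;    # brew paths, pbcopy, ...
      Linux)  source "$HOME/.dotfiles/rc_linux" ;;  # CUDA, nala alias, ...
    esac

    # for the find/grep pain: Homebrew ships GNU versions under a g prefix,
    # so scripts can pin GNU behavior on both platforms
    brew install coreutils findutils gnu-sed grep   # gives gls, gfind, gsed, ggrep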
I hear you. The Apple hardware plus Linux combination would have been great for me.
How about Asahi Linux, or a Fusion/Parallels VM on macOS?
Can't you literally install Linux on Apple hardware?
Yes, with major tradeoffs. Asahi Linux is an amazing project, but they have not yet figured out how to get anywhere close to a Mac's power efficiency when it is running MacOS. For example, you will lose a lot of battery life[0][1] with the lid closed, whereas on MacOS you lose pretty much nothing.
Also, note that Thunderbolt is not yet supported [2].
[0] https://web.archive.org/web/20241219125418/https://social.tr... [1] https://github.com/AsahiLinux/linux/issues/262 [2] https://asahilinux.org/docs/platform/feature-support/overvie...
I recommend looking at Lima for setting up deterministic build environments on a Mac! I use it with Ansible to provision testing environments:
https://github.com/lima-vm/lima
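A minimal sketch of getting started; the default instance is an Ubuntu VM with containerd/nerdctl preconfigured:

    brew install lima
    limactl start        # creates and boots the "default" Ubuntu VM
    lima uname -a        # `lima <cmd>` runs it inside the default instance
    lima nerdctl run -it --rm alpine   # docker-style containers, no Docker Desktop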
macOS has a different dev culture than Linux, but you can get pretty close if you install the Homebrew package manager. For running LLMs locally I would recommend Ollama (easy) or llama.cpp. Due to the unified memory, you should be able to run larger models than what you can run on a typical consumer grade GPU, but slower.
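A minimal sketch of the Ollama path (the model name is just an example; anything in the Ollama library works the same way):

    brew install ollama
    brew services start ollama   # or just run `ollama serve` in a terminal
    ollama run llama3.2          # pulls the weights on first use, then chats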
For me a VM set up via UTM works quite well on my Mac. Just make sure you do not emulate x86; that kills both performance and battery life. This way I get the nice battery life and performance in a small package but am not limited by macOS for my development.
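For anyone wanting to replicate that setup, roughly (the key is choosing Virtualize, not Emulate, when creating the VM):

    brew install --cask utm
    # then in UTM: Create a New VM -> Virtualize -> Linux -> point it at an
    # arm64/aarch64 ISO. "Virtualize" uses Apple's hypervisor at near-native
    # speed; "Emulate" is the x86 path that kills performance and battery.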
local-development has been fine for me on m4 pro sequoia (i switched from archlinux), not much different
but i absolutely hate macOS 26, my next laptop won't be a macbook
It's a shame what they did to this awesome hardware with a crappy update
My issue is how much I care about looks. If it’s not pretty, I have a harder time using stuff outside the CLI/TUI.
Linux is too ugly for me to use as my main device. Same with what I’ve seen of Android.
I suggest you head over to /r/unixporn, and you'll probably be pleasantly surprised. Contrary to popular belief, most of this stuff is not very hard to set up. Of course, there are people also showing off custom tooling and the craziest (and sometimes silliest) things they can pull off, but a beautiful interface is usually something you can do in under an hour. It is customary to include a repo with all configurations, so if you wanted to directly copy-paste, you could do it much faster than that.
Unless you're talking about the look of the physical machine. Well then that's an easier fix ;)
https://www.reddit.com/r/unixporn/
Pretty funny how they're actively targeting M1 users with their marketing copy with this release.
Sorry you made your first gen chip so good that I don't feel the need to upgrade lol.
Yeah, I am using a 2020 MacBook Air M1 with macOS 15.7.1 (on which I am about to install Asahi Linux) and I have no issues as a casual user. For most people who use MacBooks, I see no reason to buy an M5 or M4 over an M1.
How’s that running on M machines these days?
Asahi runs great on M1 and M2, but not on the newer chips.
One of the problems is that I don't notice a meaningful difference (that's worth the money) between my M1 and my M4 workloads (dev/video). Obviously the rendering is faster, but the OS isn't. Tahoe makes my M2 feel like an Intel Mac.
Chip, memory and storage are really fast, but I’m fully convinced that the OS is crippling these machines.
It's unfortunate that MacOS continues to get worse as the hardware gets better.
> Sorry you made your first gen chip so good that I don't feel the need to upgrade
M1 MacBooks are ~5 years old at this point, and if you've been working a laptop hard for 5 years it's often worth getting an upgrade for battery life as well as speed.
I still have 94% of my battery health left on my 2020 M1, and I have all the speed I need as a casual, average user.
Crazy good. I only have 87% on mine and I think I got it in 2022 (M1 though)... I wonder if it's because I leave it plugged in so much.
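If you want to check yours from the terminal, recent macOS reports it via system_profiler (no extra installs needed):

    # cycle count, condition, and capacity as a % of design capacity
    system_profiler SPPowerDataType | grep -E 'Cycle Count|Condition|Maximum Capacity'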
I first saw it on a device 2 years older than your M1, so you might be in a similar boat soon - but I hope not.
The lower power and heat of M-devices might result in meaningfully longer battery life, and I'm curious to find out.
frankly nothing holds a candle to the battery performance of the M series machines so it’s likely a safe bet to assume that advantage will also translate into longer overall life/battery health until we see otherwise. We’ll see in a few years I suppose.
Battery power on M1 16” was so good when new that even severely degraded is still pretty good.
I felt the same way about the battery in my 2018 MacBook ... it was losing capacity, but I didn't mind as it still ran for hours between charges.
Then it started having issues waking up from sleep. Only the OG Apple charger could wake it up, then it would see it actually had 40-60% battery but something had gone wrong while sleeping and it thought it was empty.
Intel MacBooks had terrible SMC issues, so maybe this won't afflict the M-series. Just sharing because I could still use that MacBook a few hours between charges; it just couldn't be trusted to wake up without a charger. That's really inconvenient and, combined with new features, got me to upgrade.
um no; that's a reason to upgrade the battery, not get a new laptop
How much would you charge me to swap out my MacBook Pro 2018 15.4" battery using authorized methods to not cause other damage? I want my laptop back within a few days, a 90 day warranty on parts and labor, and I want a genuine Apple battery - not some unknown 3rd party.
I got that as well.. more annoying are comparisons with the last Intel options, which sucked then.
I'm still doing fine with a 16gb M1 Air, I mostly VPN+SSH to my home desktop when I need more oomph anyway. It lasts a full day, all week when you just check email on vacation once a day.
I have a 2020 Macbook, intel... 2.3 GHz Quad-Core Intel Core i7 is it worth upgrading?
Very much so.
No fan noise, no warmth, unless you are really really pushing it.
in terms of speed, it makes it feel like the original retina did when they first came out. oh and a pretty fast disk as well.
>makes it feel like the original retina
Exactly right. M1 MacBook Pro delighted me in a way that Macs haven't done since my 2013 Retina MBP
In terms of performance, thermals, and battery life, it was a huge upgrade for me when I moved from Intel to the M1 Max. M1 Max to the M4 Max... improvements were mainly on very heavy tasks, like video transcoding with Handbrake or LLMs with Ollama.
Yes... to any Apple Silicon machine.
Just give me cellular in a MacBook Air already Apple if you want me to insta-buy! Bonus points for OLED.
Airs don't have to be just cheap. I want a thin and light premium laptop for walking around and a second Mac (of any type) for my desk.
I'm guessing carriers/networks can't handle a fleet of MacBooks-with-cellular yet. The data workload would be sustained and intense with macOS not having the type of system-level cellular framework/data control as iPad and iOS (I have used the low data mode on macOS, it helps but only handles a small part of the problem).
I have bought cracked-screen iPhones since Personal Hotspot allowed wired connections back in the 2000s, velcro'd them to the back of my MacBook screen and have been living the "I have internet on my Mac everywhere" life since then. With 5G, I can't really tell when I'm on Wi-Fi vs. when my MacBook opts for the hotspot connection.
I'd love a cellular MacBook and would also insta-buy, but I've given up hope until the next network upgrade.
That doesn’t make much sense to me, there are literally billions of phones that people are using all the time.
Apple has over 2.3 billion active devices of which a small percentage are Macs (an estimated 24 million were sold in 2024 and around twice that in iPads).
The most difficult to scale part of a cell network is number of devices connected, not bandwidth used anyway and cellular Macs aren’t going to add significantly more load to a network. And that assumes that Apple even cares what a carrier thinks.
I’m in Australia, not the USA, and for all people like to complain about internet here, we have excellent mobile coverage and it’s relatively affordable, but it’s all priced by usage.
I have 4 devices on my plan with shared 210GB of 4G usage between them for around AUD$200 (USD$130) a month on Australia’s best network (Telstra). I work remotely from cafes a lot (probably around 20-30 hours a week) as a developer and get nowhere close to that usage. I update all my apps, download all my podcasts, listen to lossless music and stream video whenever I want during breaks (although I’m not a huge out-of-home video consumer). I do literally nothing to limit my bandwidth usage and am lucky to use 30-40GB a month across all my devices.
macOS has, for a little while now, had a "metered" flag for networks (Low Data Mode). Not sure which apps, if any, respect it, but it's there.
> (I have used the low data mode on macOS, it helps but only handles a small part of the problem)
Yes, I mentioned that in the post you responded to.
> Not sure which apps, if any, respect it, but it's there
It reduces data consumption for me by about 1/5. Not nothing, but the Mac can easily consume hundreds of GB of data a week doing "normal" activities. YouTube on a MacBook is many times more data than the equivalent on a phone screen.
I've heard (but not tested) that Tahoe and iOS 26 do a _much_ better job of auto-connecting and reconnecting (if your cell drops, like going through a tunnel or similar) to make it easier to use your phone with your MBP.
I hope this is the case. I don't know if I would buy a cellular MBP (just wouldn't use it enough) but better tethering is a huge win for me.
I know minis don't sell well, but I wish they'd kept the Air 11" format, one way or another, just without the bezel.
My craving has been answered by the GPD WIN MAX 2, a 10" mini laptop with lots of ports and bitchin' performance (the AI 9 chip sips battery). It's Windows, but with an upgrade to Pro to disable the annoying stuff via Group Policy, plus never signing into a Microsoft account, it's amazing how much faster it is than a machine that's always trying to authenticate to the cloud before it does anything. Wake from sleep is also excellent, which was the main thing that kept me using MacBooks. Anyway, it's the first computer I've bought in a decade that has some innovation to it.
Edit: there's a slot for a cellular modem but I haven't done enough research to find one that will play nice on US networks
Why carry around two cellular modems? Are you ever out and about with your computer but not your phone? I've been happy to hotspot my computers and tablets to my phone, which I always have with me.
The only possible issue I can think of is battery life, but if I'm carrying around my laptop I can throw a charge cable in the bag to keep my phone juiced.
I want my computer to have an always on cell modem just like my phone does.
The Apple Silicon chips all run in a version of always on these days because the efficiency cores are so, well, efficient.
Additionally, while you may want to burn the battery in multiple devices and deal with having to manage that, I don’t want to.
Apple has been selling cellular iPads since the beginning and I love never having to worry about pairing mine.
Tethering to an iPhone or iPad is much better than it used to be, but it's still not perfect.
Apple makes their own modems these days and even with Qualcomm had a capped per device license fee more than covered by the premium they charge for cellular in, say, the iPad.
I know so many people who want this convenience and are willing to pay for it that it just seems like stubbornness at this point that they’re willing to put modems in iPads and not MacBooks.
I leave my setup plugged in, using a low-profile USB-C to lightning cable on the iPhone SE stuck on the back of my screen and wired hotspot on macOS is a great experience.
We're discussing a MacBook someday with a built-in phone, the closest I've found is an iOS device wired to my MacBook as a wired hotspot. It's like having fast wifi everywhere.
Using my personal phone (that I also use for other things like calls) wouldn't be like having wifi everywhere on my Mac, for example if I walk away from my laptop while on the phone the Mac would lose internet.
The pairing has become almost flawless as well. Years ago, it was slow and inconsistent, but now the hotspot feature is almost perfect and automatic. Honestly, I don’t really think about it anymore.
For tablets, at least with T-Mobile, for $25 you get unlimited data. You only get a limited amount of tethering data.
True, but don't you have to treat the device as a separate line? If the laptop had cellular I'd have another bill to pay.
T-mobile is so scammy, though. Have you been keeping up with all the lawsuits against them in the US?
All carriers are scammy in their own way
MVNOs FTW. They know they're competing for price-conscious consumers so have to offer more value. The big 3 know most of their customers are going to go with one of the big boys, all of whom are expensive and not great.
MVNOs have slower data rates since they buy deprioritized traffic in bulk, don’t have the roaming agreements domestically and especially not internationally, and don’t offer unlimited high speed data.
So, maybe don't do business with the worst of them? Of the big 3 in the US, T-Mobile is the one I'd avoid right now.
Verizon is overpriced and is the Comcast equivalent of cellular carriers. AT&T is about as bad
Verizon owns most of the value brands in the US and you made no connection between them and Comcast beyond mentioning them in the same sentence.
AT&T is "about as bad" as what? You gave no information.
Price, customer service
Based on the type of responses you are giving, I actually do believe you probably call your phone company's customer service regularly. So perhaps your criteria might be different. Have you heard of Consumer Cellular?
Consumer cellular rate plans are the same price as T-Mobile and they don’t have international roaming included.
Apparently, in Europe, the box will not contain a charger [1]. This is absolutely mind-blowing to me.
edit: the suggested retail price also dropped by EUR 100. Mind is less blown now. It seems like a good thing, in fact.
edit2: in Belgium, the combined price of the 70W adapter and 2m USB-C to MagSafe is EUR 120.
[1] https://forums.macrumors.com/threads/new-macbook-pro-does-no...
Removing the charger is a good move, in my opinion.
USB-C chargers are everywhere now. Monitors with USB-C or Thunderbolt inputs will charge your laptop, too. I bought a monitor that charges over the USB-C cable, and I haven't used the charger that came with the laptop in years because I have a smaller travel charger that I prefer for trips anyway.
You don’t have to buy the premium Apple charger and cable. There are many cheap options.
I already have a box of powerful USB-C chargers I don’t use. I don’t need yet another one to add to the pile.
The battery is good enough that I often travel with just my phone charger. I can plug the laptop in at night when the slow charge rate isn't a hindrance and be fine with the all day battery life
I was really surprised to find that the MacBook didn't mind charging from the 20W phone brick
it will even charge with the old USB-A brick.
Takes like 10 hours and isn't officially supported I think, but it does work.
>USB-C chargers are everywhere now
USB-C 15W chargers may be everywhere, but the higher-power charger required for a MacBook Pro is not.
I would have agreed if the device used 10W or 20W, where you could just charge it slightly slower. Not for a 70W to 100W MacBook Pro though.
I've got several 50+W chargers from other devices (old MBP, soldering iron, a generic one). If you don't have a high-power charger, buy one. Easy enough. But there are plenty of us that don't need another.
Every time I’ve gotten rid of an old laptop the charger goes with it. They are a package deal in my book.
I actually have very few USB-C chargers. With everyone leaving them out of the box, I don’t happen to have a bunch of them by chance. They took them out of the box before giving time for people to acquire them organically. I never bought a single lightning cable, but almost all my USB-C cables had to be purchased. This is not great, considering how confusing the USB-C spec is.
Other than the one that came with my M1 MBP (which I will lose when I sell it), I have had to purchase every charger I have.
Not being able to charge a $1,500+ laptop without buying a separate accessory is crazy to me. I’ve also seen many reports over the years comparing Apple chargers to cheap 3rd party ones where there are significant quality differences, to the point of some of the 3rd party ones being dangerous or damaging. I don’t know why Apple would want to open the door to more of that.
I assume a lot of people will use a phone charger, then call support or leave bad reviews, because the laptop is losing battery while plugged in. Most people don’t know what kind of charger they need for their laptop. My sister just ordered a MacBook Air a couple weeks ago and called me to help order, and one of the questions was about the charger, because there were options for different chargers, which confused her and had her questioning if one even came with it or if she had to pick one of the options. This is a bad user experience. She’s not a totally clueless user either. She’s not a big techie, but in offices she used to work with, she was the most knowledgeable and was who they called when the server had issues. She also opened up and did pretty major surgery on her old MacBook Air after watching a couple YouTube videos. So I’d say at least 50% of people know less than her on this stuff.
Apple positions themselves as the premium product in the market and easy to use for the average user. Not including the basics to charge the internal battery is not premium or easy. I can see it leading to reputational damage.
I have some 50+W chargers from old devices also. However, they are much, much bigger than the current ones. It doesn't matter when my computer is plugged in at home, but I wouldn't want to travel with one since it's easily 3x the size/weight.
My MacBook M1 Pro w/ 441 cycles started doing a fun thing where if the battery gets under about 50% and you put it to sleep, the ONLY way to power on the device is to use the exact charger it came with. Higher powered Apple Studio Display PD, or even good 3rd party chargers, do not bring it back to life. This occurs even when the battery has 40-60% remaining if the laptop goes to sleep.
Had a similar issue with my 2018 MBP Intel - the 86/87 Watt Apple charger was the only thing it would come to life with as the battery aged if the device got too low.
My dad had similar trouble with an M1 MacBook Pro that got a depleted battery. Two chargers he had wouldn't work, but fortunately the Anker charger that I used for my laptop did work with one of the cables that I had (though not another). Once it got a little juice into it, it was fine and he could switch back to his. But I think he was a bit more careful to avoid total depletion after that.
In 2018 I had a phone that entered a boot loop: battery depleted, plug it in, it automatically starts booting, it stops charging while booting, it dies due to depletion, it recognises it’s plugged in and starts charging, boot, stop, die, start, boot, stop, die… I tried every combination of the four or five cables that I had with a car USB power adapter and someone’s power bank, nothing worked. Diverted on my way home (an 8 hour journey) to buy another phone because I needed one the next day. When I got home, I tried two or three power adapters with all the cables and finally found one combination that worked. I learned the lesson that day that allowing devices to deplete completely can be surprisingly hazardous.
> I learned the lesson that day that allowing devices to deplete completely can be surprisingly hazardous.
The solution is to keep your devices charged. This is feasible if you have a few devices. Not practical for someone like me. I have too many devices. I don't use every device daily.
Yes, I don't often let batteries deplete, but the issue I'm having on my MacBooks is that they will "die" with plenty of charge left (often 40-60%). The computer thinks it is at 0% and won't boot past the "plug me in!" screen with anything except the OG charger from the Apple box. As soon as you connect the OG charger, it boots automatically and you see 0% battery go to 40-60% battery. At this point, you can unplug the MacBook and use it, as long as you don't put it to sleep. Obviously battery/power related, but the only fix is using the charger that says/does exactly what my MacBook wants. I wonder how Apple handles this on the M5.
If it's dead or if it's low?
In my experience a low-power charger will revive it; you just have to wait for it to reach a sufficient state of charge, since the machine is effectively starting off the battery. This does take a while, but booting a dead machine on a supply that can't guarantee enough power would be dumb.
My M1 Pro with 441 battery cycles won't power back on without the Apple charger it came with if I close the lid or put it to sleep and the battery isn't over 60%... something happens and the computer goes into a sleep state where the battery doesn't drain but no charger except the OG brings it back to life.
Even a Studio Display, which can provide more power than my M1 Pro can use, won't wake it from this state. Apple wants $300 for a replacement battery so I'll just buy a new MacBook at that price, but the charger situation doesn't bode well for M5 MacBook buyers who wonder why their Mac is dead one day (and they just need the exact charger the system wants, but Apple didn't provide it)
>Apple wants $300 for a replacement battery
Looks like iFixit thinks it's only a "moderate" difficulty replacement that should only cost you $109.
https://www.ifixit.com/Guide/MacBook+Pro+14-Inch+2021+Batter...
Replacing a MacBook battery is a lot of delicate work. Not everyone has steady hands, great eyes, etc. For $300, the Apple Store is a better deal for most (and guaranteed to be a quality battery with warranty) compared to the difficulty of a $110 battery kit.
I don't want to use a 3rd party battery in a device I carry with me most places I go...
USB-C chargers are everywhere but definitely not at the wattage needed to drive a laptop.
I agree. I currently have 2 or 3 brand-spanking-new chargers sitting in their original Mac boxes.
On the go, I've bought a small GaN charger with multiple ports. At home, I already have all of my desks wired up with a USB-C charger.
Adding to the anecdata, the magsafe charger for my M1 MBP has been used a grand total of two times in five years.
When I buy a new laptop and sell my old one, I either have to sell the old one with the charger or keep the charger and sell the old one without. I don't actually have a bunch of spare chargers capable of charging a laptop (phone, sure).
This is especially true for someone moving up to an MBP from an MBA, which takes less juice.
I think it's awesome. I have way too many chargers, especially USB-C: 5, 10, 20, 35, 70, and 95W, all over the house and office. If you need one, just shell out the extra $100 that corresponds to your needs.
The real crime is that it starts at 1799 euros, which is $2100, vs $1599 in the US, I know US prices are before tax but even with 20% VAT you're far off...
Apple overprices everything in the EU on top of not shipping new features. Currency risk is a thing but nowhere near the premium they charge. I personally vote with my wallet and stopped buying anything from them.
Why don't you vote with your vote and oppose the regs that are preventing Apple from shipping new features in the EU?
Those regulations don't prevent apple from shipping anything, they prevent companies from abusing their users. Apple is free to ship without abusing anyone, but explicitly choose otherwise.
1599 plus a 20% VAT is 1918. That's... far from the worst it's been TBH.
ASUS’s ZenBook Duo (UX84060) is over 50% dearer in Australia than in the USA.
When it was announced, I expected it to be at least 4000 AUD (~2600 USD). When I heard it was starting at 1500 USD instead (~2300 AUD), I was astonished and very excited. And it still is that price… but only in the US. In Australia it is 4000 AUD (the 32GB/1TB model, which is 1700 USD, ~2600 AUD). So I sadly didn’t get one.
Is the rest of the world subsidising the US market, or are they just profiteering in the rest of the world?
The EU requires a 1 year warranty on electronics, where in the US it's only 90 days. The higher cost of electronics reflects this.
It's 2 years according to https://europa.eu/youreurope/citizens/consumers/shopping/gua...
> Under EU rules, if the goods you buy turn out to be faulty or do not look or work as advertised, the seller must repair or replace them at no cost. If this is impossible or the seller cannot do it within a reasonable time and without significant inconvenience to you, you are entitled to a full or partial refund. You always have the right to a minimum 2-year guarantee from the moment you received the goods. However, national rules in your country may give you extra protection.
> The 2-year guarantee period starts as soon as you receive your goods.
> If a defect becomes apparent within 1 year of delivery, you don't have to prove it existed at the time of delivery. It is assumed that it did unless the seller can prove otherwise. In some EU countries, this period of “reversed burden of proof” is 2 years.
> where in the US it's only 90 days
As far as I know, the US has zero warranty laws. It can be zero days.
*) 2 years is the warranty for consumers in the EU; 1 year only for business/enterprise customers.
Apple has a 1 year warranty in the US.
https://www.apple.com/legal/warranty/products/embedded-mac-w...
They've always been like this; it's why the market share is as low as it is vs the US.
what happened to US tariffs? how can they be cheaper in the US than EU?
Apple is exempt from tariffs.
Tim Cook kissed the ring. That's all it takes.
Tariffs... Someone has to pay them and it sure isn't Apple.
I think you read him backwards. It’s still cheaper in the US. Tariffs certainly exist in Europe but I’m unaware of any on these laptops and US Tariffs on goods from China don’t apply to goods from China to anywhere else that isn’t the US.
It should rather be "public option healthcare, social safety nets, and a robust surveillance state aren't going to pay for themselves"
MacBooks shipped to Europe don't ever touch US ground (and I'd wager 99.9% of their parts don't either). So US tariffs should be irrelevant - and the EU doesn't have big China tariffs outside of EV and solar panel anti-dumping retaliation.
Of course, you’re not wrong.
Apple could subsidize by absorbing part of the tariff in the U.S. and overcharging in the EU.
That said, in the EU we have a two-year warranty.
Sales tax in the U.S. (there is no VAT) is no more than about 12%.
It's much cheaper in the US
I have a strong feeling Apple is raising prices elsewhere in order to avoid pissing off the notoriously sensitive consumers in America. Sony is doing similar things with making the PlayStation expensive everywhere to make it affordable for Americans. The world is essentially subsidizing the tariffs for Americans.
People in other countries will get pissed but ultimately suck it up and buy a product. People in America will take it as a personal offense due to the current Maoist-style cult of personality, and you'll get death threats and videos of them shooting your products posted onto social media. Just look at what happened to that beer company. No such thing would happen in Germany.
>The world is essentially subsidizing the tariffs for Americans.
I was told the opposite thing would happen. Sounds like a great deal for us Americans!
> I have a strong feeling Apple is raising prices elsewhere in order to avoid pissing off the notoriously sensitive consumers in America.
Or a certain individual…
Many people got played on the charger thing. It’s never free, it’s a mandatory bundle. But companies only put one line item on the receipt, never refer to the primary component separately but instead conflate its name and idea with the bundle, and when forced to de-bundle (usually) bump the primary component’s price to compensate, and people buy it: “the EU took away my charger!”
Chargers don’t change quickly. If I lost my charger from 2019, the ideal replacement in 2025 would be literally exactly the same model—and mine still works like new and looks good. I have nothing to gain from buying a new charger.
We should be cheering the EU for ending an abuse that the US has long failed to.
Also, it still bundles a USB-C to MagSafe 3 cable.
I mean, every laptop needs a charger.
If you sell your old laptop when you buy a new one, you generally sell it with the old charger. And different Apple laptops take chargers with different maximum watts (they're compatible but not optimal), so they're not all the same anyway.
There's a reason they generally make sense to bundle. Especially with laptop chargers, which provide a whole lot more power than some random little USB-C charger you might have. Sometimes letting the free market decide actually gives customers what they want and find most useful.
> If you sell your old laptop when you buy a new one, you generally sell it with old charger.
Sounds like a symptom of incompatibility. I’ve only ever included the charger when it was specific to the laptop.
> And different Apple laptops take chargers of different maximum watts (they're compatible but not optimal), so they're not all the same anyways.
Chargers automatically provide whatever power level is needed, up to their max, and charging power isn’t the steady tick upward we’re used to elsewhere. The MacBook Pro did get a faster charger a few years ago, relegating old ones to that “compatible but not optimal” state, but meanwhile MacBook Air chargers got slower, and most releases didn’t change the charger. Certainly there are sometimes benefits to buying a new charger, but it happens much less often than new device purchases, and even when there are benefits purchases should still be the customer’s choice.
> Sometimes letting the free market decide actually gives customers what they want and find most useful.
I agree, but “free market” doesn’t mean lawlessness, it means an actual market that’s actually free. Actual market: companies compete on economics, not e.g. violence or leverage over consumers. Actually free: consumers freely choose between available options. Bundling is a very classic example of an attempt to circumvent free market economics, using the greater importance of one choice to dictate a second choice.
> Bundling is a very classic example of an attempt to circumvent free market economics, using the greater importance of one choice to dictate a second choice.
Only when there's no competition and you can use that to abuse market power.
But competition for laptops is strong. Most consumers want their laptops to come with a charger, even if you personally don't. That's why they're sold that way.
Like, nobody says the free market is failing because Coke forces me to buy carbonated H2O along with their syrup at the grocery store. The market prefers it when they're bundled.
Don't forget the environmental impact of a smaller box. The box will probably be less than half as thick, doubling shipping efficiency. These are air freight, so the CO2 impact is not negligible.
I'll take the discount and use one of my 12 existing USB-C chargers.
There are more 90W-capable USB-C chargers in my home than there are laptops. I certainly don't need another one. Honestly, I'd be fine with them removing the box altogether and using a paper envelope like Steve Jobs did once.
>Don't forget the environmental impact of a smaller box
Compared to the marginal environmental impact of sourcing materials, building hardware and parts, assembling, shipping, stocking, and transporting each unit to the customer, the box could be 10x larger and it wouldn't make a dent.
> ship ... the box could be 10x larger and it wouldn't make a dent
This is not how shipping works.
A larger box, even by 1 inch on any direction, absolutely makes a huge difference when shipping in manufacturing quantities. Let's not pretend physical volume doesn't exist just to make an argument.
10 planes flying with MacBooks == much different than 1 plane (in other words, when you 10x the size of something, as you suggest, it does actually have a huge impact)
The point being made is "it's not the paper for the box that's the issue".
A smaller box allows more to be carried. But if we go that route, it's trivial to ship them without any box and box them domestically - and that's a 2-3x volume reduction right there.
> it's trivial to ship them without any box and box them domestically
Ah yeah I can't imagine any scenario where this could go wrong
Like man in the middle attacks
Replacement/fake products
... or you know, damage? Boxes provide... protection.
> it's trivial
Anytime you catch yourself thinking something is trivial, you're probably trivializing it (aka think about it more and you'll probably be able to think of a dozen more reasons packaging products is the norm)
Bizarrely you can only select a new adapter during the configuration if you select 24GB RAM or higher
Prices are about 65 EUR for a 70W (tested DE + CH)
The EU law states they must provide an SKU without an adapter - i.e. they're still allowed to offer one with a power adapter.
Not sure about that. I never use my official charger or the MagSafe cable, but man, how did we arrive here? Some things just belong with a laptop.
My mbp came with a 140w charger - which I never use.
> Apparently, in Europe, the box will not contain a charger [1]. This is absolutely mind-blowing to me.
Same, for a laptop??? Really? Wild. You can charge these with USB-C chargers too.
Base 14" MBP M5 prices without VAT or sales taxes:
Germany: 1758 USD (1512 EUR) without charger.
US: 1599 USD with 70W charger.
This feels like an insult.
Dropping the price was nice. They could have gotten away with a slight reduction in price, and a coupon inside to send away for a "free" charger, and then bask in the millions who couldn't be bothered to do it.
What really bugs me is that the huge performance gains are quoted against the M1 and an Intel Mac (a 5-7 year old chip) that, from my own memory, had throttling and overheating issues. While not as impressive, I'd really appreciate it if they simply showed the generational gains, or actual charts against several previous generations.
I'm still pretty happy with my 16gb M1 Air, but it would be nice to know some closer to real world differences.
> I'm still pretty happy with my 16gb M1 Air, but it would be nice to know some closer to real world differences.
I’m confused — they made a comparison that is directly relevant to your situation and you don’t like it?
Most people with an M4 won’t be looking to upgrade to an M5. But for people on an M1 (like you) or even an older Intel chip, this is good information!
I think they're trying to decide whether it's worth jumping all the way to an M5 or whether they'd rather just get an M4 or M3 at discount.
You can't expect Apple to make an argument against their own chips... you're asking them to admit that they are making ~20% a year improvements when they want buyers to think it's a multiples-of-X improvement.
How did we get to no one being impressed by 20% better PER YEAR?
Macs barely got faster for ages with Intel - they just got hotter and shorter on battery life.
20% per year is a doubling every 4 years (1.2^4 ≈ 2.07). That is awesome.
> How did we already get to no-one being impressed by 20% better PER YEAR already.
When has 20% been impressive? When Intel to M1 happened, the jump was huge ... not 20%. I can't think of anything with a 20% jump that made waves, even outside of tech.
When I used to do user benchmarking, 20% was often the first data point where users would be able to notice something was faster.
4 minutes vs 5 minutes. That's great! Kind of expected that we'll make SOME progress, so what is the low bar... 10%? Then we should be impressed with 20?
People aren't upgrading from M1, M2, M3 in numbers... so I don't think it's just me that isn't wow'd.
No WiFi 7, only WiFi 6E, is annoying, especially for what they are charging. And Bluetooth 5.3. Their pro Macs are slower than their iPhone Pro.
The SSD has double the speed. I guess they say this only for the M5 MacBook Pro, because the M4 always had slower SSD speeds than the M4 Pro, at 3.5GB/s. So now the M5 should be at 7GB/s.
I assume no update on SDXC UHS-III.
The non-pro/max chipped MBPs have always been a little 'lower spec' in several regards. There used to be a little more separation though, with the non-pro chips available only in the Air & 13" MBP, but back then people complained about Apple having 'too many models'...
I suspect the M5 Pro/Max chipped MBPs will bring some of these improvements you're looking for.
What use case do you have (or anticipate having) for WiFi 7 out of curiosity?
With NVMe NASes and 5Gbit and 8Gbit FTTH available for a reasonable price in many places, it's easy to saturate any WiFi connection just by downloading stuff (games, AI models, etc.), backing up files, or accessing your files on a NAS (and editing videos straight off a NAS is recently trendy).
Anyone know when to expect the M5 Pros? I am on a base 16GB M1 and struggling hard in daily workloads; I am often running at 20GB of swap usage.
I don't really use local LLMs, but I think 32GB RAM would be good for me... I am so ready to upgrade but trying to figure out how much longer we need to wait.
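(If anyone wants to see the same numbers on their own machine, both of these ship with macOS:)

    sysctl vm.swapusage   # total / used / free swap
    vm_stat               # page-level stats; climbing "Pageouts" means pressure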
First rule of the Mac world is get the most memory you can afford.
I got the cheapest M1 Pro (the weird one they sold that's binned due to defects) with 32GB RAM and everything runs awesome.
Always get the most RAM you can in the Mac world. Running a largish local LLM model is slowish, but it does run.
A Mac out of memory is just a totally different machine than one with enough.
Probably because most of the devs building the software are on the highest RAM possible, and there is just so much testing and optimization they don't do.
Part of me misses my OG base 14" M4 Pro. The battery on that thing was absolutely phenomenal - literal 12-14+ hours of real-world use. Not so much on the 14" M1 Max (64GB) that I upgraded to after about 2 yrs.
'Real-world idle' efficiency on the newer chips is the main reason I've got the (slight) itch to upgrade, but 64GB+ MBPs certainly don't come cheap.
Rumors suggest they might be early next year, or likely by spring.
There was a 6 month gap between the M4 and M4 Pro, so maybe a while.
This gap makes no sense to me. I wonder if Apple is just leaning into this cycle because it's easier to make M5s than more advanced processors, so you can sell this sooner?
From a buyer's perspective, I don't like it at all.
M series chips are ridiculously massive, and Apple apparently does not want to transition to chiplets, so they can't easily compose CPUs. Thus refining the process and improving yields on the smaller parts probably makes sense.
As another example, the current Ultra part is the M3 Ultra, and it was released in early 2025, after even the M4 Pro/Max and a good 18 months after the M3 was unveiled. We might not see an M4 Ultra until 2027.
Different Chip SKUs are often a TON of work. By trying to release all of them at the same time, you'll have a chip pipeline where you need tons of work, all at the same time, all in the same stages of the process. By staggering them, you spread this work out across the year.
it says "up to 14 hours more than the Intel-based MacBook Pro", which means the Intel one was designed to last 10 hours, matching what was advertised: https://web.archive.org/web/20201109092341/https://www.apple...
we went from 10 hours to 24 hours in 5 years - impressive
i wonder why they advertise gaming on the laptop; does anyone play anything meaningful on macbooks?
I play absolutely everything on my M1 Macbook Pro. Through Crossover, basically every Windows game runs fine. I used to check ahead of time before buying a game, but it's so good I now kind of just assume games work.
A NVIDIA 2080 graphics card from 2018 still surpasses the M5 for gaming. The M5 Pro coming early next year will likely finally catch up with the 8-year-old 2080.
I'm happy to hear your games work well for you, but it sounds like the games you're playing aren't demanding compared to modern AAA titles. Most games released in the last year or two don't run well on my 2080 test system at anything approaching decent graphics.
A 2080 is about the same performance as a 5060 and every game is going to be able to run on a 5060. You might not be running it at 4K Ultra with ray tracing enabled but you should be able to run at like 1080p High or better.
Whether or not the M5 GPU is actually capable of that level of performance or whether the drivers will let it reach its potential is of course a completely different story. GPU performance is hard to estimate based on raw specs, you just have to run benchmarks and see what you end up with.
> A 2080 is about the same performance as a 5060
A 5060 outperforms a 2080 by roughly 20% on most titles, across the board, not cherry-picking for the best results. They are not about the same.
> you should be able to run at like 1080p High or better
This is disconnected from reality. 1080p low/medium, some games are playable but not enjoyable. Remember, I actually have a 2080, so I'm not just guessing.
> GPU performance is hard to estimate based on raw specs, you just have to run benchmarks and see what you end up with.
Rich coming from someone who claims a 7 year old graphics card is "about the same" as a card which has 2.5x better RayTracing, has 3x faster DLSS, faster VRAM, and much better AI capabilities. The 2080 can't even encode/decode AV1...
A reminder that the majority of what people actually play isn't "modern AAA titles": https://steamcharts.com/top
> I'm happy to hear your games work well for you, but it sounds like the games you're playing aren't demanding compared to modern AAA titles.
That's why I made the specific distinction in the comment you're responding to
When a $599 Windows laptop with a 3060 can play AAA titles and your $1599 MBP can't, I wouldn't normally call that great for gaming.
I would guess they are mostly talking to game devs for now, but man, if in a few years Apple can get me to throw out the Windows rig that I (and I imagine many others) keep around just for gaming, I wouldn't hate that!
The gaming world is so deeply ingrained with Windows technologies. Even with the Game Porting Toolkit from Apple, I don't see the mods and patches that some Windows players enjoy.
I do almost all of my gaming on an M3 MacBook Air. It’s great for games. I’ll sometimes hop on Windows for titles unavailable on the Mac, but increasingly I just skip them if they aren’t on Steam for MacOS.
I get that it is good for some games, but when people say "gaming PCs" on Windows, they usually mean AAA titles. The stuff on endcaps at BestBuy for sale for PC and console. Those games won't run well on Macs unless you spend insane amounts on a Max or Ultra variant.
The M5's GPU specs seem to put it near a high-end NVIDIA card from 2018. Impressive as all get out for a power-friendly chip, but not really what I think of when I hear "good for gaming"
Baldurs Gate 3 runs great on m4 pro at 1080p (I’m on a Mac mini though).
mac mini is amazing, even though happy to hear people play games on macbooks
Most of the games I play (League of Legends, Civ, Factorio) work really well on my MacBook.
As an M1 owner seriously tempted by the hardware, seriously put off by Tahoe…
Am I remembering right that the previous 14" MacBook Pro started at $1399 (and seems to be no longer available?), so this is a $200 price increase?
(I had just been looking at macs a few weeks ago, and had noticed how close in price macbook pro and macbook air were for same specs -- was thinking, really no reason not to get pro even if all I really want it for is the built-in HDMI. They are now more price differentiated, if I am remembering right).
That would be about a 15% increase, which is probably in the ballpark to be explained by tariffs (either existing or anticipatory)?
Are there any markets where they are 15% cheaper due to not having the US's tariffs?
Good question, although I don't think they would price them differently because the Trump administration has openly signaled hostility toward private companies who would transparently pass on tariff costs and Apple has openly signaled subservience to the Trump administration.
Someone I know saw the 14" starts from $1,599 and the 16" starts at $2,499, and quipped "The most expensive 2 inches ever."
However, it is not just because of the larger display.
M5 14" starts at:
10-Core CPU
10-Core GPU
16GB Unified Memory
512GB SSD Storage
M5 16" starts at:
14-Core CPU
20-Core GPU
24GB Unified Memory
512GB SSD Storage
So it's the cost of 4 more CPU cores, double the GPU cores (10 more), and +8GB of memory.
The 16" doesn't offer the M5 (yet), but rather the M4 Pro and Max CPUs. Other differences are the higher number of performance cores vs efficiency cores, and memory bandwidth, which is significantly higher in the M4 Pro/Max lines (273 and 410 GB/s) versus the M5 (153 GB/s).
Where did you get these specs from? The page linked in the OP says the 16 in is only available in M4.
Apple’s chip release schedule is so borked. It should be High end Pro and Studio first and then iPad, Air, Mini and downgraded Pro. Why they release the iPad and Low End Pro is beyond me.
Everyone buying their high end gear is buying something waiting to be refreshed now.
Hasn't that been the case throughout the industry for the last two decades now? Back when Intel was still on tick-tock, the low-powered laptop chips were always first, then mainstream desktop, then workstation and server, roughly a year delayed. Maybe I'm mistaken, but it seems to make sense: if you mainly offer monolithic chips, you'd want to start with a smaller die size to better leverage the process.
AMD is somewhat of an exception/unique case though, having chiplet and monolithic designs depending on the use case, plus console/semi-custom offerings, so that doesn't map fully.
Also, let's not forget that in Apple's case they actually go phone first, then Air+iPad, then Pro and finally Studio. I feel that the lower-end devices should take priority, personally; efficiency gains are more valuable in connected devices with limited space for batteries than on my 16-incher with 100Wh.
Course, would be nice if we just got the entire range updated at once, but I doubt even Apple could pull such a supply chain miracle off, even if they bought all of TSMC and the entire island to boot...
> Everyone buying their high end gear is buying something waiting to be refreshed now.
Most of their buyers aren’t buying the highest end parts. Those are a niche market.
Focusing on the smaller parts first makes sense because they’re easier to validate and ship. The larger parts are more complicated and come next.
I’m guessing this is so they optimize processor yields as manufacturing improves
Smaller chips means more of a wafer is usable when a defect exists
+1 on this... it also gives them more opportunity to work out any issues when piecing the larger chips together while catching any post-production issues on the simpler, lower end parts before they hit bigger customers.
Well here’s the thing, M5 is (probably) a big A19, M1 was a big A14, and so forth. The whole thing of apple silicon is that it’s large phone chips (optimized for performance per watt) rather than small workstation chips (performance at all costs)
A Fab ramping up a new process node is hardly a new thing.
The standard practice is to start by producing the chips with the smallest die size.
Is this because they know some whales that want the +1 model are going to jump at the opportunity to buy whatever is on the market, and then buy again when higher-range models are released?
Apple knows their sales numbers, so I imagine is that they know the base model will sell the most quantity. Having it out now means more sales at the highest MSRP before talk of the next model release.
Buyers who walk into an Apple store for a base MacBook Pro will wait if they hear a new model is coming out. So if you have a buyer basing purchases on the generation number, it makes sense to launch that model as soon as possible.
Pro/Max buyers generally are checking into specs and getting what they need. Hence the M2 Ultra still being for sale, for some niches that have specific requirements.
Could be yield related. Do the non top end products run pro chips that didn’t pass testing fully, and have parts disabled?
…or they are still working on fine tuning/testing production of the world’s most technologically difficult thing to mass manufacture?
It takes 3 months to manufacture a chip end to end. If you find a hardware bug once complete, you have to start over and you end up with a three month delay.
Looks like the Pro and Max will be on a three month delay.
If you find a hardware bug that late, you're not fixing it and you're looking for chicken bits to flip. It is /extremely/ expensive to validate changes and remake the masks.
> Why they release the iPad and Low End Pro is beyond me.
People in the U.S. are starting to think about their Christmas shopping lists right about now.
wait I can’t have M5 with 64GB ram? highest is just 32GB which is ridiculous!
M5 Pro and M5 Max will come later with higher RAM support.
This has been their staggered release strategy for a while.
I thought the Pro and Max usually get announced at the same time and the Ultra comes later...
Correct. Sometimes much later.
Still no M4 Ultra Studio available.
they have historically had three tiers of cpus:
- normal
- pro
- max
pro and max had way more cores and gpus and supported way more ram. today's release is the basic version of the new cpu; if you want more ram you can get the m4pro or m4max based MacBook Pros, or wait for the M5pro/max to come out.
so, MacBook Pro M5 normal, MacBook Pro M5 Pro, MacBook Pro M5 Max. I see the M4 still in the offer, and it costs more. No WiFi 7 on the M5 (normal or not). 32GB these days, unified or not, in a pro machine... I haven't paid much attention to Macs over the past few years, but I wonder what Steve Jobs would say about this shit.
It seems like only the M5 base chip is available at the moment. The M5 Pro and Max, whenever those are released, will presumably have higher limits.
and its unified
I've been a Windows fan forever, but the new Mac hardware is making it hard to remain and it's about time for a new laptop... can't get a good Windows installed on these chips like you could on the Intel-based ones, only virtualized.
Virtualized Windows on M chips is quicker than non-virtualized Windows on your average corporate laptop in my experience.
ARM Windows still has so many pain points, depending on your niche.
Wow maybe it's worth a try
The lowest tier comes with 16GB of memory, the same memory as the lowest MacBook Air. Why, Apple?
It looks like the highest tier is 32GB, which really surprised me. I guess we'll have to wait for the M5 Pro / M5 Max for more memory than that.
Bad news for anyone who buys the M5 MacBook Pro as an "AI" machine and finds it can't fit any of the more interesting LLMs!
It has always been this way. Base M1's max RAM was 16GB, M2/M3's was 24GB, M4's was 32GB.
The base M1 was like 4-5 years ago. Did we have that much RAM with the oldest Macs too, 30 years ago? 4-5 years at the same base RAM is incredibly cheap behaviour from Apple. Some phones are literally close to that RAM now.
Base M1 had 8GB RAM base and 16 max. Where are you getting “same base ram”?
100% RAM growth in 5 years is still very slow and cheap.
I think 8GB was also what we had in ... 2012? Or am I wrong. Memory has been going so slow.
You could (unsupported) run 16gb ram on 2010 rMBP models, back before it was soldered on. Worked great, not to mention swapping the spinning drive for an SSD.
At this point, I get the soldered on ram, for better or worse... I do wish at least storage was more approachable.
First Retina MacBook Pros 13" were with 8 GB base memory. That was either 2013 or even 2012, so, yeah.
Indeed. RAM is the tool of planned obsolescence (and profiteering, for that matter).
Get an ad blocker and then get all the people writing Java/Electron apps to fix their memory usage and you'll be good.
Exceptions apply to those running local LLMs.
They are running out of ways to differentiate their products.
Wait what!?!? My MacBook M1 has 64GB of memory for crying out loud.
M1 maxed out at 16 GB, if you have 64 it’s an M1 Max.
Wait what!?!? You may have a MacBook Pro M1 Max for crying out loud.
The Pro sales page says their RAM is unified, which is more efficient than traditional. Does anyone have a sense of how much more efficient unified RAM is vs traditional RAM?
Their sales copy for reference:
"M-series chips include unified memory, which is more efficient than traditional RAM. This single pool of high-performance memory allows apps to efficiently share data between the CPU, GPU, and Neural Engine.... This means you can do more with unified memory than you could with the same amount of traditional RAM."
The fact that it's unified just means it's shared between CPU/GPU usage which can be a good thing. A lot of the performance comes from more channels and a more stable distance from the CPU itself... Getting really fast performance from RAM is more difficult with a detachable interface in the middle, and longer traces.
Still not the fastest RAM, like what they use for dedicated GPUs, but faster than most x86 options.
RAM has always been unified on every M series CPU.
It just means it's shared between the GPU and CPU. That has its advantages in specific workloads, but dedicated super-fast GPU RAM is usually better. Everything else in this statement is marketing bullshit and Apple trying to look like they invented the wheel and discovered fire.
It's unified on the Air too though?
yes, all Mx processors have unified ram
Because the MacBook "Pro" with a base (not Pro or Max) M chip is, and has always been, an Air with better cooling.
Search for memory wall. Moore’s law died a decade ago for DRAM
Well, they could have continued the 8GB joke, so let's appreciate the fact that they finally switched to 16GB base models (and similarly stopped the 128GB SSD madness; those models were outdated when bought).
For perspective I have a 12 year old MacBook with 8 gigs of ram and it’s still perfectly usable for all the things I do on it. If you need more RAM because you are video encoding, compiling, or gaming (why!?) then you aren’t a basic consumer.
I'm not trying to be a fanboy, and maybe it's a little bit "cope", but Apple has always put as much RAM as is necessary for the computer to work, and not a lot more, in their base models.
The $1599 M5 Macbook Pro: Good enough for a guy who thinks a 12-year-old MacBook with 8 gigs of ram is "still perfectly usable"
:)
I know it’s silly, but I think I represent over 90% of apples customers in that way. I just need something that reads emails and shows me porn.
> I know it’s silly, but I think I represent over 90% of apples customers in that way
You're not silly, you're just able to see reality.
Apple knows who is buying the bulk of their computers, and it isn't power users ... most people buying computers don't have a clue what RAM is even used for.
I'd hit beachballs, but macOS balances 8GB of RAM fine even with Tahoe for regular users
That 90% are perfect candidates for the even cheaper Macbook Air lineup.
[dead]
And a "pro" computer that comes with half a TB of storage by default, with a $200 premium for another 0.5 TB of storage. Oof. Just gross.
I know people complain at every release. But I look at the three choices presented and they are all disappointing to me. It's a huge turnoff to see the only initial differentiator presented to be a choice between "measly" amounts of RAM and storage to "barely acceptable" amounts.
To get even close to the specs on my Surface Pro I'd have to hit the configurator and spend at least $1000. Even more to hit the config of my work issued HP notebook.
Probably great, but when the hell are they going to do another damn colour? Hoping by the time I upgrade from the M4 Pro they'll have a green version and a cell modem.
I wish they'd bring Space Gray back. Not a huge fan of Silver, and the 'Space Black' apparently tends to show smudges more.
At least for now, it seems to be available only in the 14" MacBook Pro. I want a 16" M5 MacBook Pro, so I will wait ...
Only the cheapest MacBook gets the M5, while the rest stay with the M4 Pro and M4 Max? What's going on with that lineup?
Pro/Max rollout tends to lag behind the 'base' by about 6 months
It used to be a little less 'weird' when the base M-chips were only available in the Air and 13" MBP.
Because there is no M5 Pro and Max yet obviously.
Looks premium, eco-responsible, powerful, just like the M4. And the M3. And the one before. Does anyone know what exactly changed this year?
https://www.apple.com/mac/compare/?modelList=MacBook-Air-M4,...
- M4 -> M5: same core count and distribution, "neural accelerators", higher memory bandwidth
- max storage increased from 2 to 4TB (and probably an extra kidney in price)
Everything else is strictly identical.
The marketing blurb claims 2~3x the performance of the M4 for AI stuff (I assume that's the neural accelerators), 20% for CPU tasks (compiling), and 40~80% for GPU tasks (gaming and 3D rendering).
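Taking those claimed multipliers at face value (I've used the midpoints of the ranges above; these are Apple's numbers, not benchmarks), the wall-clock impact works out like this:

    # Convert claimed speedups into "time saved", at face value.
    claims = {"AI tasks": 2.5, "CPU (compiling)": 1.2, "GPU (gaming/3D)": 1.6}

    for task, speedup in claims.items():
        saved = (1 - 1 / speedup) * 100
        print(f"{task}: {speedup}x -> {saved:.0f}% less wall-clock time")

    # e.g. a 10-minute compile at 1.2x drops to ~8m20s, on paper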
This is great but it's apples to oranges until they have Pro & Max variants we can directly compare against the M4 line.
Comparing the models which correspond exactly is apples and oranges? In what wonky alternate reality do you live?
Not to mention the M4 Pro and Max released 6 months after the M4. If that holds for the M5, it won't be this year.
4TB was available before, though maybe only coupled with a Max CPU, which doesn't seem to be available yet for the M5.
I'm comparing comparable configurations. You can get 128GB RAM and 8TB storage on an M4 Max, and 512GB RAM and 16TB storage on an M3 Ultra. Neither is relevant to the M5.
The number, which means you should upgrade.
Wanted to see if it's worth upgrading from the M1 Pro, but I don't see the M5 Pro in the compare dropdown. https://www.apple.com/macbook-pro/compare/
TBH, I have an M2 Pro (personal) and an M4 Pro (work) and I have never been able to tell the difference in day-to-day use. That said, the only really intensive workload I have is photography/videography batch post-processing, and I haven't tested that on my work machine. I'm disappointed Apple hasn't published any benchmarks comparing the M5 to the various M4 (or M3) variants.
They've only updated the base M5. Expect the Pro and Max updates to come early next year.
I find it annoying that now I want two MacBooks: one tuned for local LLMs and the other light with a big screen (the Air 15).
I never really use local LLMs since I can always go to Claude, but spending irresponsibly on that is always lurking in my irrational brain.
I read the title as "MS MacBook Pro" and prepared for a surprise.
The lack of WiFi 7 is disappointing. 6E is fine but by now I'd expect 7 in new computers.
Especially considering the M5 iPad pro has WiFi 7.
Sounds like maybe they didn't want to try to fit their new N1 chip this go-around so they could reuse some components? The MacBook still has the same Broadcom chip. Or they're saving it as a Pro-differentiating feature for when the M5 Pro/Max comes out later. There's a rumored MBP redesign, so I'm guessing we'll see the N1 (and WiFi 7) then.
Good chance they'll introduce it with the upcoming M5 Pro/Max; the non-pro/max devices always tended to be a little lower spec all around.
Are these custom boards or just mini PCIe network cards you could swap out?
This is Apple: the last time they shipped a PCIe/replaceable WiFi card was thirteen years ago, on the mid-2012 non-Retina MacBook Pro.
Even pre-Apple Silicon, it's been a decade since users could upgrade MacBook's RAM or internal storage.
It is in the new iPad Pro as part of the new Apple chipset, so presumably it's coming to other machines later.
Homo consumator needs bigger numbers even if he acknowledges the smaller number is OK.
What do you do on WiFi that requires more than 10Gb per second... on a laptop? At that rate you'd fill up the base model SSD in a matter of minutes of downloading (rough math below).
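Assuming the 512GB base SSD and, optimistically, a sustained line-rate link:

    # Minutes to fill an SSD at a given sustained link rate.
    # Real-world WiFi rarely sustains anywhere near line rate.

    def fill_minutes(ssd_gb: float, link_gbps: float) -> float:
        return ssd_gb / (link_gbps / 8) / 60  # Gb/s -> GB/s, seconds -> minutes

    print(f"{fill_minutes(512, 10):.1f} min at 10 Gb/s")  # ~6.8 minutes
    print(f"{fill_minutes(512, 46):.1f} min at 46 Gb/s")  # WiFi 7 theoretical max, ~1.5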
I run our office IT, and WiFi 7 is just better at managing congestion. We have a floor in a busy building and 5GHz is chaos. 6E is fine; it's just strangely old for a company like Apple.
Sad to see a release without refreshing the high-end 16" MBP. I worry they nerfed the 14" MBP so the M4 16" retains better specs and the discontinuity isn't worse. Otherwise the 14" outperforming the more expensive 16" would be uncomfortable.
This is only the base M5, not an M5 Pro or Max, and the 16" has never gotten the base chip.
Isn't the screen size difference basically enough between these two? I can't see why the 16" would need more performance; some people just want (or can carry) large computers with them, while others prefer something as small as possible.
It's more a statement on price and the assumption that the more expensive model with the "more capable" chip, like the Max, would not be less performant than anything else in the lineup. It would be a disappointment, especially for me as I'm about to buy a 16" in November regardless, to be a generation behind while paying more, and it would not be unusual, for product reasons, to nerf the lower-priced chip so it doesn't cannibalize the more expensive models' sales.
I have to say, if I had any choice I would delay my purchase until the 16" catches up rather than buy a generation behind. If I see specs saying the M5 14" is more performant for my workloads than my more expensive 16", I'm even more motivated to delay. Most product managers would be aware of these things.
I can see why that sounds sensible, but my personal observation is that heavy-duty power users almost universally prefer the bigger screens, and those people also want the highest-level specs. Most people I know who want smaller screens are not serious power users.
I can see an overlap between people who want smaller computers and people who want max power, but I just don't believe that's a significant group (again, all personal observation).
More pixels? That's the only reason I can think of. 13/14 inch is what I tend to go for, since I use my laptop as a desktop 80% of the time. 16" is really too big for my needs.
I also think the 15 inch MacBook Air filled the non-power-user-but-likes-big-screen niche.
If I buy in the US and use it in the EU, will Apple Intelligence work?
I would be happy to sacrifice the EU keyboard and have the AI instead :-)
Unless they’re now doing iPhone-level parts locking, getting the right keyboard would be an eBay purchase away.
Why is Apple releasing an MBP with the old generation of Pro chips (the M4 Pro)?
The M5 Pro refresh will come later. M4 Pro parts are still available until then.
> The M5 Pro refresh will come later.
Did they announce this or are you speaking for Apple?
They release the Pro and Max parts after the base part.
This has been their release strategy for past generations.
Does anyone have a guess on when they could be releasing a potential M4 Ultra Mac Studio?
I think they said M4 Ultra Studio is not going to happen, have to wait for M5 Ultra...
...they're not. This is a release of a 14" with the base M5, alongside the other existing M4 Pro/Max models.
The Pro/Max rollout tends to lag behind by about 6 months.
Related:
Apple M5 Chip
https://news.ycombinator.com/item?id=45591799
Apple's announcement: https://news.ycombinator.com/item?id=45591873
Does it have full-sized cursor keys? PgUp/PgDown/Home/End keys? If not, it's a fashion accessory; it's definitely not a PROductivity machine used by PROfessionals.