I really respect Apple's privacy-focused engineering. They didn't roll out _any_ AI features until they were capable of running them locally, and before doing any cloud-based AI they designed and rolled out Private Cloud Compute.
You can argue about whether it's actually bulletproof or not, but the fact is that nobody else is even trying; the rest have lost sight of privacy-focused features entirely in their rush to ship anything and everything on my device to OpenAI or Gemini.
I am thrilled to shell out thousands and thousands of dollars to purchase a machine that feels like it really belongs to me, from a company that respects my data and has aligned incentives.
Mac OS calls home every time you execute an application.
Apple is well on its way to ensuring you can only run things they allow via the App Store; they would probably already be there if it weren't for the pesky EU.
If you send your computer/phone to Apple for repair you may get back different physical hardware.
Those things very much highlight that "your" Apple hardware is not yours and that privacy on Apple hardware does not actually exist. Sure, they may not share that data with other parties but they definitely do not respect your privacy or act like you own the hardware you purchased.
Apple marketing seems to have reached the level of indoctrination where everyone just keeps parroting what Apple says as an absolute truth.
They send a hash of the binaries/libraries, and generate a cache locally so it's not sent again. That helps stop you from running tampered-with binaries and frameworks. No user-personal data is sent.
There is no evidence at all that they are trying to ensure you can only run things from the App Store - I run a whole bunch of non-app-store binaries every single day. To make that claim is baseless and makes me de-rate the rest of what you write.
There is always a trade-off between privacy and security. This still falls well under the Google/Android/Chrome level, or indeed the Microsoft/Windows level with its targeted ads, IMHO.
My understanding is that they keep a local file with known malware signatures, just like the malware scanners on every other platform.
> macOS includes built-in antivirus technology called XProtect for the signature-based detection and removal of malware. The system uses YARA signatures, a tool used to conduct signature-based detection of malware, which Apple updates regularly
XProtect is a blacklist that runs locally and is rarely used.
The phone-home functionality is notarization, where Apple does a network call to check that the signature on an executable actually came from Apple's notarization process. It is in essence a reputation system, where developers must be on good terms with Apple to have the ability to notarize and get a smooth install experience.
1. Most users are not capable of using general purpose computing technology in a wild, networked environment safely.
2. Too many people who matter to ignore insist, "something must be done."
3. And so something shall be done.
4. Apple is navigating difficult waters. As much as I disapprove of how they have chosen a path for iOS, the fact is many people find those choices are high value.
5. I do, for the most part, approve of their choices for Mac OS. I am not sure how they prevent malicious code without maintaining some sort of information for that purpose.
6. We are arriving at a crossroads many of us have been talking about for a long time. And that means we will have to make some hard choices going forward. And how we all navigate this will impact others in the future for a long time.
Look at Microsoft! They are collecting everything! And they absolutely will work with law enforcement anytime, any day, almost any way!
I sure as hell want nothing to do with Windows 11. Most technical people I know feel the same way.
Screenies every 3 to 5 seconds? Are they high? Good grief! Almost feels like raw rape. Metaphorically, of course.
Then we have Linux. Boy am I glad I took the time way back in the 90's to learn about OSS, Stallman, read words from interesting people: Raymond, Perens, Searls, Lessig, Doctorow, many others!
Linus did all of tech one hell of a solid and here we are, able to literally dumpster dive and build whatever we want just because we can. Awesome sauce in a jar right there, but!
(And this really matters)
...Linux just is not going to be the general answer for ordinary people. At least not yet. Maybe it will be soon.
It is an answer in the form of a crude check and balance against those in power. Remember the "something shall be done" people? Yeah, those guys.
And here we are back to Apple.
Now, given the context I put here, Apple has ended up really important. Working professionals stand something of a chance choosing Mac OS rather than be forced into Windows 11, transparent edition!
And Apple does not appear willing to work against their users' best interests, unless they are both compelled to by law and have lost important challenges to said law.
If you want that, your choices are Apple and Linux!
7. Open, general purpose computing is under threat. Just watch what happens with Arm PC devices and the locked bootloaders that will follow, just like on mobile devices.
Strangely, I find myself wanting to build a really nice Intel PC while I still can do that and actually own it and stand some basic chance of knowing most of what it is doing for me. Or TO ME.
No Joke!
As I move off Win 10, it will be onto Linux and Mac OS. Yeah, hardware costs a bit more, and yeah it needs to be further reverse engineered for Linux to run on it too, but Apple does not appear to get in the way of all that. They also do not need to help and generally don't. Otherwise, the Linux work is getting done by great people we all really should recognize and be thankful for.
That dynamic is OK with me too. It is a sort of harsh mutual respect. Apple gets to be Apple and we all get to be who we are and do what we all do with general purpose computers as originally envisioned long ago.
We all can live pretty easily with that.
So, onward we go! This interesting time will prove to be more dangerous than it needs to be.
If it were not for Apple carving out a clear alternative, things would look considerably more draconian (I could, and maybe almost should, say fascist), and to me completely unacceptable.
As someone who cut his teeth on computing in the era you refer to, I have a small disagreement about Linux (especially Ubuntu) in your statement.
Apple is priced beyond the reach of many "ordinary people", especially outside the western markets. A cheap (perhaps aftermarket) laptop with Ubuntu on it (often installed by the seller) is something that has been getting a lot of traction among regular users. Most of the things they do are via a browser, so as long as Chrome/FF works, they're good. They often install software that undermines the security that the platform natively offers, but still, it's a pretty decent compromise.
> I run a whole bunch of non-app-store binaries every single day
If you are in the US, you need to either register as a developer, or register an Apple ID and register your app to run it for a week. That's how you run non-App Store code. Both of those require permission from Apple.
This is completely incorrect. You can download a random binary and execute it. You will get a warning dialog saying it’s not signed by a known developer. You are free to ignore that though.
Depends what you mean by fiddling. But I'm in the process of switching to mac from Linux because my new job has forced it upon me.
I tried installing "Flameshot" via homebrew and it wouldn't run until I went into Finder, right clicked it and clicked open. Luckily it's mentioned in their docs [0] or I would have never guessed to do this.
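(For what it's worth, what usually triggers that prompt is the com.apple.quarantine extended attribute that browsers and some installers set on downloads. Below is a rough sketch of inspecting and clearing it, assuming a hypothetical install path and that you trust the app; the right-click-Open dance is the supported way to get the same approval.)

    # Sketch only: inspect and clear the quarantine flag Gatekeeper reacts to.
    # The app path is hypothetical; only do this for software you trust.
    import subprocess

    APP = "/Applications/flameshot.app"  # wherever the app actually landed

    # List extended attributes; com.apple.quarantine is the one driving the prompt.
    print(subprocess.run(["xattr", "-l", APP], capture_output=True, text=True).stdout)

    # Remove the quarantine attribute recursively so launches skip the warning.
    subprocess.run(["xattr", "-dr", "com.apple.quarantine", APP], check=True)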
If I were you, I would relax. At least you are not being shoved onto Win 11.
And then think about that. Seriously. I did. Have a few times off and on over the years as we sink into this mess.
I bet you find an OS that does a bit more than you may otherwise prefer to prevent trouble. If so, fair call in my book.
Just how big of a deal is that, compared to Android, Windows 10, and tons of network services and such, and what they do not do FOR you and instead do TO you?
And you can run a respectable and useful installation of Linux on that spiffy Apple hardware when it gets old. So make sure it gets old, know what I mean?
As someone that just got out of a gig where I had to run Docker on MacOS - for the love of god, I would have done almost anything to use Windows 11.
Look - if I'm going to be treated like garbage, advertised to and patronized, at least let me use the system that can run Linux shells without turning into a nuclear reactor.
It's not "a big deal" if the user knows about it, but the phrasing in macOS is maliciously bad - I sent a build from my machine to a coworker and when they "naively" ran it, the pop-up that came up didn't say "this program is unsigned", it said "this program is damaged and will now be deleted" (I don't remember the exact phrasing, but it made it sound like a virus or a damaged download, not like an unsigned program).
> Apple is well on its way to ensuring you can only run things they allow via the App Store; they would probably already be there if it weren't for the pesky EU.
People have been saying this ever since Apple added the App Store to the Mac in 2010. It’s been 14 years. I wonder how much time has to go by for people to believe it’s not on Apple’s todo list.
> If you send your computer/phone to Apple for repair you may get back different physical hardware.
I happen to be in the midst of a repair with Apple right now. And for me, the idea that they might replace my aging phone with a newer unit, is a big plus. As I think it would be for almost everyone. Aside from the occasional sticker, I don't have any custom hardware mods to my phone or laptop, and nor do 99.99% of people.
Can Apple please every single tech nerd 100% of the time? No. Those people should stick to Linux, so that they can have a terrible usability experience ALL the time, but feel more "in control," or something.
Why not both? Why can’t we have a good usability experience AND control? In fact, we used to have that via the Mac hardware and software of the 1990s and 2000s, as well as NeXT’s software and hardware.
There was a time when Apple’s hardware was user-serviceable; I fondly remember my 2006 MacBook, with easily-upgradable RAM and storage. I also remember a time when Mac OS X didn’t have notarization and when the App Store didn’t exist. I would gladly use a patched version of Snow Leopard or even Tiger running on my Framework 13 if this were an option and if a modern web browser were available.
NeXT was great and Mac OS X was also nice and had a lovely indie and boutique app ecosystem during the mid-to-late 2000s. Sadly, iOS stole the focus. However, the OP argues Linux usability is bad, which I think is an outdated POV. It really depends on your setup and use cases. For many development use cases, Linux is superior to macOS.
I run NixOS on a plain X11 environment with a browser, an editor and a terminal. It's really boring. For my favorite development stacks, everything works. Flakes make workflow easy to reproduce, and it's also easy to make dramatic setup changes at OS level thanks to declarativeness and immutability.
If you're interacting with other humans, or with the consumer internet, you'll run into thousands of situations where my default setup (macOS, Chrome) "just works," and your setup will require some extra effort.
You may be smart enough to figure it out, but most people (even many smart tech people) get tired of these constant battles.
Here's an example from earlier this evening: I was buying a plane ticket from Japan Air Lines. Chrome automagically translates their website from Japanese to English. Other browsers, e.g. Firefox, and even Safari, do not - I checked. Is there a workaround or a fix? I'm sure you could find one, given time and effort. But who wants to constantly deal with these hassles?
Another very common example is communication apps. Or any time you're exchanging data in some proprietary format. Would it be great if no one used proprietary formats? Yes! Is that the world we live in? No. Can I force the rest of the world to adopt open standards, by refusing to communicate with them? No.
The world has moved on from desktop environments to multi-device integration like Watch, Phone, AirTags, Speakers, TV and in that way Linux usability is certainly worse than MacOS.
Oh sort of. That is for sure a thing, but not THE thing.
I would argue people are being tugged in that direction more than it being simply better.
You can bet that when people start to get to work building things (all sorts of things, not just software), they find out pretty quickly just how important a simple desktop running on a general purpose computer really is!
It could help to compare to other makers for a minute: if you need to repair your Surface Pro, you can easily remove the SSD from the tray, send your machine and stick it back when it comes repaired (new or not)
And most laptops at this point have removable/exchangeable storage. Except for Apple.
> remove the SSD from the tray, send your machine and stick it back when it comes repaired
Apple has full-disk encryption backed by the Secure Enclave, so it's not bypassable.
Sure, their standard question set asks you for your password when you submit it for repair.
But you don't have to give it to them. They will happily repair your machine without it because they can boot their hardware-test suite off an external device.
I get your point, but we can also agree "send us your data, we can't access it anyway, right?" is a completely different proposition from physically removing the data.
In particular, if a flaw were to be revealed in the Secure Enclave or the encryption, it would be too late to act on it after the machines have been sent in for years.
To be clear, I'm reacting on the "Apple is privacy focused" part. I wouldn't care if they snoop my bank statements on disk, but as a system I see them as behind what other players are doing in the market.
I hear the point you're making and I respect the angle, it's fair enough, but...
The trouble with venturing into what-if territory is the same applies to you...
What if the disk you took out was subjected to an evil-maid attack?
What if the crypto implementation used on the disk you took out was poor?
What if someone had infiltrated your OS already and been quietly exfiltrating your data over the years?
The trouble with IT security is you have to trust someone and something, because even with open-source, you're never going to sit and read the code (of the program AND its dependency tree), and even with open-hardware you still need to trust all those parts you bought that were made in China, unless you're planning to open your own chip-fab and motherboard plant?
It's the same with Let's Encrypt certs: every man and his dog is happy to use them these days. But there's still a lot of underlying trust going on there, no?
So all things considered, if you did a risk assessment, being able to trust Apple? Most people would say that's a reasonable assumption?
> even with open-source, you're never going to sit and read the code (of the program AND its dependency tree)
You don't have to. The fact that it's possible for you to do so, and the fact that there are many other people in the open source community able to do so and share their findings, already makes it much more trustworthy than any closed Apple product.
> What if the disk you took out was subjected to an evil-maid attack?
Well, have fun with my encrypted data. Then I get my laptop back, and it's either a) running the unmodified, signed and encrypted system I set before or b) obviously tampered with to a comical degree.
> What if the crypto implementation used on the disk you took out was poor?
I feel like that is 100x more likely to be a concern when you can't control disk cryptography in any meaningful way. The same question applies to literally all encryption schemes ever made, and if the feds blow a zero day to crack my laptop, that's a victory through attrition in anyone's book.
> What if someone had infiltrated your OS already and been quietly exfiltrating your data over the years?
What if aliens did it?
Openness is a response to a desire for accountability, not perfect security (because that's foolish to expect from anyone, Apple or otherwise). People promote Linux and BSD-like models not because they cherry-pick every exploit like Microsoft and Apple do, but because deliberate backdoors must accept that they are being submitted to a hostile environment. Small patches will be scrutinized line-by-line; large patches will be delayed until they are tested and verified by maintainers. Maybe my trust is misplaced in the maintainers, but no serious exploit developer is foolish enough to assume they'll never be found. They are publishing themselves to the world, irrevocably.
What if the disk could be removed, put inside a thunderbolt enclosure, and worked on another machine while waiting for the other? That's what I did with my Framework.
Framework has demonstrated in more than one way that Apple's soldered/glued-in hardware strategy is not necessary.
I suppose so they can do a boot test post-repair or something like that. I have only used their repair process like twice in my life and both times I've just automatically said "no" and didn't bother asking the question. :)
With Apple FDE, you get nowhere without the password. The boot process doesn't pass go. Which catches people out when they reboot a headless Mac: the password comes before boot, not after, even if the GUI experience makes you feel otherwise.
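(If it helps to see why the password is the whole ballgame, here's a generic envelope-encryption sketch. This is emphatically not Apple's actual FileVault implementation; on real Macs the derivation is also entangled with a hardware key in the Secure Enclave, which this toy version omits.)

    # Toy model: the random volume key that encrypts the disk is stored only
    # in wrapped form, under a key derived from the user's password.
    import os
    from cryptography.hazmat.primitives.kdf.scrypt import Scrypt
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    volume_key = AESGCM.generate_key(bit_length=256)   # what actually encrypts blocks
    salt, nonce = os.urandom(16), os.urandom(12)

    def wrap(password: bytes) -> bytes:
        kek = Scrypt(salt=salt, length=32, n=2**15, r=8, p=1).derive(password)
        return AESGCM(kek).encrypt(nonce, volume_key, None)

    def unwrap(password: bytes, wrapped: bytes) -> bytes:
        kek = Scrypt(salt=salt, length=32, n=2**15, r=8, p=1).derive(password)
        return AESGCM(kek).decrypt(nonce, wrapped, None)  # raises InvalidTag if wrong

    blob = wrap(b"correct horse battery staple")
    assert unwrap(b"correct horse battery staple", blob) == volume_key
    # unwrap(b"wrong guess", blob) -> InvalidTag: no password, no volume key, no data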
You need to trust the erasure system, which is software. This also requires you to have write access to the disk, whatever the issues are; otherwise your trust is left in the encryption and in nobody having the key.
That's good enough for most consumers, but a lot more sensitive for enterprises IMHO. It usually gets a pass by having the contractual relation with the repair shop cover the risks, but I know some roles that don't get macbooks for that reason alone.
>And for me, the idea that they might replace my aging phone with a newer unit, is a big plus. As I think it would be for almost everyone.
Except that isn't generally how factory repairs are handled.
I don't know about Apple specifically, but other groups (Samsung, Microsoft, Lenovo) will happily swap your unit with a factory-refurbished or warranty-repaired unit as long as it was sufficiently qualified beforehand -- so the 'replaced with a newer unit' concept might be fantasy.
I've seen a few Rossmann streams with officially "refurbished" MacBooks that were absolutely foul inside. Boards that looked like they had been left on a preheater over lunch, rubber wedges to "cure" a cracked joint, all sorts of awful shit. The leaked stories from the sweatshop that did the work were 100% consistent with the awful quality.
Admittedly this was a few years ago. Has Apple mended their ways or are they still on the "used car salesman" grindset?
It would depend on a country's consumer laws. I used to work for AASPs in Australia and they definitely used refurbished phones for replacements and refurbished parts for the Mac repairs. Not everyone who uses this site lives in America...
With the sheer number of devs who use Macs, there is a 0% chance they’re going to outright prevent running arbitrary executables. Warn / make difficult, sure, but prevent? No.
The strategy is to funnel most users onto an ipad-like platform at most where they have basic productivity apps like word or excel but no ability to run general purpose programs.
Meanwhile you have a minimal set of developers with the ability to run arbitrary programs, and you can go from there with surveillance on MacOS like having every executable tagged with the developer's ID.
The greater the distance between the developer and the user, the more you can charge people to use programs instead of just copying them. But you can go much further under the guise of "quality control".
> The strategy is to funnel most users onto an ipad-like platform
They make the best selling laptop in the world, and other most-popular-in-class laptops. If their strategy is to have people not use laptops, they are going about it funny.
> The strategy is to funnel most users onto an ipad-like platform at most where they have basic productivity apps like word or excel but no ability to run general purpose programs.
And you know this how?
This reads like every macOS fan’s worst nightmare, but there’s zero actual evidence that Apple is going in this direction.
Further, there is a CRL/OCSP cache — which means that if you're running a program frequently, Apple are not receiving a fine-grained log of your executions, just a coarse-grained log of the checks from the cache's TTL timeouts.
Also, a CRL/OCSP check isn't a gating check — i.e. it doesn't "fail safe" by disallowing execution if the check doesn't go through. (If it did, you wouldn't be able to run anything without an internet connection!) Instead, these checks can pass, fail, or error out; and erroring out is the same as passing. (Or rather, technically, erroring out falls back to the last cached verification state, even if it's expired; but if there is no previous verification state — e.g. if it's your first time running third-party app and you're doing so offline — then the fallback-to-the-fallback is allowing the app to run.)
Remember that CRLs/OCSP function as blacklists, not whitelists — they don't ask the question "is this certificate still valid?", but rather "has anyone specifically invalidated this certificate?" It is by default assumed that no, nobody has invalidated the certificate.
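If it helps, here is a toy model of that decision flow, just to make the fail-open shape concrete. This is a sketch of the logic as described above, not Apple's code; the responder function and TTL are made up.

    import time

    # Toy fail-open revocation check with a TTL'd local cache.
    # Only an explicit "REVOKED" answer blocks launch; network errors never do.
    TTL = 12 * 3600
    cache = {}   # cert_id -> (verdict, expires_at)

    def check_revocation(cert_id, query_responder):
        verdict, expires = cache.get(cert_id, (None, 0.0))
        if verdict is not None and time.time() < expires:
            return verdict                      # fresh cache hit: no network traffic at all
        try:
            verdict = query_responder(cert_id)  # returns "GOOD" or "REVOKED"
        except OSError:
            # Responder unreachable: fall back to any stale cached verdict,
            # otherwise allow -- "erroring out is the same as passing".
            return verdict if verdict is not None else "GOOD"
        cache[cert_id] = (verdict, time.time() + TTL)
        return verdict

    def may_launch(cert_id, query_responder):
        return check_revocation(cert_id, query_responder) != "REVOKED"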
> i.e. it doesn't "fail safe" by disallowing execution if the check doesn't go through. (If it did, you wouldn't be able to run anything without an internet connection!) Instead, these checks can pass, fail, or error out; and erroring out is the same as passing. (Or rather, technically, erroring out falls back to the last cached verification state, even if it's expired; but if there is no previous verification state — e.g. if it's your first time running third-party app and you're doing so offline — then the fallback-to-the-fallback is allowing the app to run.)
> Last week, just after we covered the release of Big Sur, many macOS users around the world experienced something unprecedented on the platform: a widespread outage of an obscure Apple service caused users worldwide to be unable to launch 3rd party applications.
Scroll down a little further on your link for confirmation of what the parent said:
> As was well-documented over the weekend, trustd employs a “fail-soft” call to Apple’s OCSP service: If the service is unavailable or the device itself is offline, trustd (to put it simply) goes ahead and “trusts” the app.
Even at the time people quickly figured out you could just disconnect from the internet as a workaround until the issue was fixed.
Both Windows and MacOS require that developers digitally sign their software, if you want users to be able to run that software without jumping through additional hoops on their computer.
You can't distribute software through the Apple or Microsoft app stores without the software being signed.
You can sign and distribute software yourself without having anything to do with the app stores of either platform, although getting a signing certificate that Windows will accept is more expensive for the little guys than getting a signing certificate that Macs will accept.
On Windows, allowing users to run your software without jumping through additional hoops requires you to purchase an Extended Validation Code Signing Certificate from a third party. Prices vary, but it's going to be at least several hundred dollars a year.
That's just your extremely limited experience (2 stores): Homebrew runs a special command clearing the quarantine bit so you don't get that notification, which does exist if you download apps directly.
It used to be that you could run any third-party application you downloaded. And then for a while you'd have to right-click and select Open the first time you ran an application you'd downloaded, and then click through a confirmation prompt. And in macOS 15, you have to attempt to open the application, be told it is unsafe, and then manually approve it via System Settings.
Huh? It hashes the binary and phones home doesn’t it? Go compile anything with gcc and watch that it takes one extra second for the first run of that executable. It’s not verifying any certificates
When I first run locally-built software I tend to notice XProtect scanning each binary when it is launched. I know that XProtect matches the executable against a pre-downloaded list of malware signatures rather than sending data to the internet, but I haven't monitored network traffic to be sure it is purely local. You can see the malware signatures it uses at /private/var/protected/xprotect/XProtect.bundle/Contents/Resources/XProtect.yara if you're curious.
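For anyone curious, the same kind of purely local signature scan is easy to reproduce with the standard YARA bindings. A small sketch follows; whether Apple's own rule file parses cleanly outside their tooling is another question (and SIP may block reading it), but any .yara rules file shows the mechanism.

    # Sketch: local, signature-based scanning in the XProtect style.
    # Requires the yara-python package; the rules path is Apple's published
    # location, but any YARA rules file demonstrates the idea.
    import sys
    import yara

    RULES = "/private/var/protected/xprotect/XProtect.bundle/Contents/Resources/XProtect.yara"

    rules = yara.compile(filepath=RULES)   # parse signatures once, entirely offline
    matches = rules.match(sys.argv[1])     # scan the binary given on the command line
    print("flagged:", [m.rule for m in matches] or "clean")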
> not share that data with other parties but they definitely do not respect your privacy
not sharing my data with other parties, or using it to sell me stuff or show me ads, is what I would define as respecting my privacy; Apple checks those boxes where few other tech companies do
Agree. I recently went to an Apple store in Tokyo to buy an accessory. The Apple employee pulled up their store iPhone to take my payment (Apple Pay) and then asked me to fill out a form with my email address, and there was a message about how my info would be shared with some company. I thought about going back and pretending to buy something else so I could film it. I questioned the store person: isn't Apple supposed to be "privacy first"? If it was privacy first they wouldn't have asked for the info in the first place, and they certainly wouldn't be sharing it with a 3rd party.
Their repair policy, from what I can see, is a thinly veiled attempt to get you to either pay for Apple Care or to upgrade. I got a quote to repair a colleague's MacBook Pro, less than 2 years old, which has apparent 'water damage' and which they want AUD $2,500 to repair! Of course that makes no sense, so we're buying a new one ...
The problem with many self-repair people is they effectively value their time at zero.
I value my time realistically, i.e. above zero and above minimum wage. It is therefore a no-brainer for me to buy AppleCare every ... single ... time. It means I can just drop it off and let someone else deal with messing around.
I also know how much hassle it is. Like many techies, I spent part of my early career repairing people's PCs. Even in big PC tower cases with easy accessibility to all parts, it's still a fucking horrific waste of time. Hence these days I'm very happy to let some junior at Apple do it for the cost of an AppleCare contract.
> The problem with many self-repair people is they effectively value their time at zero.
Back in 2010 Apple quoted me €700 for a topcase replacement because of shattered display glass. Instead I paid €50 for a third party replacement pane and did 15 minutes of work with a heat gun.
What's more, they fold most of the cost of the repair into the price of parts. So you can either get a replacement screen for €499 and install it yourself, or have it officially repaired for €559. This effectively subsidizes official repairs and makes DIY repairs more expensive.
Apple does extreme gouging with repairs; it's hogwash to claim anything else.
A big problem with AppleCare, here in Thailand anyway, is that you need to give them your computer for a few weeks. You have to wait a week for them to look at it. They won't even allow you to keep using it and then bring it back in a week.
How often do you actually need a repair from Apple? I used to buy AppleCare but stopped in the last few years and have yet to need any repairs done except a battery replacement on a 14 Pro that I was giving to family.
My hope is that the machine will work for a long while, like most of them do. In my case it’s a ~$1200 machine so I prefer to self-insure. I’m taking the chance that if it goes bad, I’ll pay to fix or replace it.
This makes sense, for me, when I do it on everything that I buy.
Because it feels like extortion. There was almost certainly no water damage caused by external factors: the user didn't spill anything on it and has literally no idea where the so-called water damage could have come from. I have heard anecdotally that this is their go-to for denying claims and it is difficult to argue against.
> Apple is well on its way to ensuring you can only run things they allow via the App Store; they would probably already be there if it weren't for the pesky EU
What has the EU done to stop Apple doing this? Are Apple currently rolling it out to everywhere but the EU?
At the very least Apple are better than Microsoft, Windows and the vendors that sell Windows laptops when it comes to respecting user experience and privacy.
I switched to iPhone after they added the tracker blocking to the OS.
Everything is a tradeoff.
I'd love to live in the F-Droid alt-tech land, but everything really comes down to utility. Messaging my friends is more important than using the right IM protocol.
Much as I wish I could convince everyone I know and have yet to meet to message me on Signal or whatever, that simply isn’t possible. Try explaining that I am not on Whatsapp or insta to a girl I’ve just met…
Also it is nice to spend basically no time maintaining the device, and have everything work together coherently. Time is ever more valuable past a certain point.
But why do we have to choose between convenient and open? Why are these companies allowed to continue having these protected "gardens"? I don't believe a free and truly open ecosystem for mobile devices would actually be less convenient than iOS or Android. If anything it would be vastly better.
They have big numbers. Big numbers tell them that 95% of people would need to be in closed, protected gardens rather than getting slaughtered by open source wolves.
Has it occurred to you that the stronger control of the ecosystem is a feature that supports the convenience and integration that's possible?
This is just the "Why not Linux desktop" argument from the past two decades. Sure, in theory it can be configured to do a lot of different things. But you're probably gonna have to work out the details yourself because the downside of theoretically supporting everything is that it's impossible to just have it work out of the box with every single scenario.
> Around one year ago, after joining the Blender Development Fund and seeding hardware to Blender developers, Apple empowered a few of its developers to directly contribute to the Blender source code.
I'm assuming similar support goes to other key pieces of software, e.g., from Adobe, Maxon, etc... but they don't talk about it for obvious reasons.
The point being Apple considers these key applications to their ecosystem, and (in my estimation at least) these are applications that will probably never be included in the App Store. (The counterargument would be the Office Suite, which is in the App Store, but the key Office application, Excel, is a totally different beast than the flagship Windows version, that kind of split isn't possible with the Adobe suite for example.)
Now what I actually think is happening is the following:
1. Apple believes the architecture around security and process management that they developed for iOS is fundamentally superior to the architecture of the Mac. This is debatable, but personally I think it's true as well for every reason, except for what I'll go into in #2 below. E.g., a device like the Vision Pro would be impossible with macOS architecture (too much absolute total complete utter trash is allowed to run unfettered on a Mac for a size-constrained device like that to ever be practical, e.g., all that trash consumes too much battery).
2. The open computing model has been instrumental in driving computing forward. E.g., going back to the Adobe example, After Effects plugins are just dynamically linked right into the After Effects executable. Third party plugins for other categories often work similarly, e.g., check out this absolutely wild video on how you install X-Particles on Cinema 4D (https://insydium.ltd/support-home/manuals/x-particles-video-...).
I'm not sure if anyone on the planet even knows why, deep down, #2 is important; I've never seen anyone write about it. But all the boundary-pushing computing fields I'm interested in, which are mainly around media creation (i.e., historically Apple's bread-and-butter), seem to depend on it (notably they are all also local-first, i.e., can't really be handled by a cloud service that opens up other architecture options).
So the way I view it is that Apple would love to move macOS to the fundamentally superior architecture model from iOS, but it's just impossible to do so without hindering too many use cases that depend on that open architecture. Apple is willing to go as close to that line as they can (in making the use cases more difficult, e.g., the X-Particles video above), but not actually willing to cross it.
> Apple is well on its way to ensuring you can only run things they allow via the App Store
I am totally OK with this. I have personally seen Apple reject an app update and delist the app because a tiny library used within it had a recent security concern. Forced the company to fix it.
No one is stopping you from using only the App Store if you value its protection, so you need a more relevant justification for forcing everyone else to do so.
Sure – Apple are trying to stop people who don't know what they're doing from getting hurt. Hence the strong scrutiny on what is allowed on the App Store (whether it's reasonable to charge 30% of revenue is an entirely different question).
People who are installing things using a terminal are probably (a) slightly computer savvy and (b) therefore aware that this might not be a totally safe operation.
Genuinely asking: are there any specifics on this? I understand that blocking at the firewall level is an option, but I recall someone here mentioning an issue where certain local machine rules don’t work effectively. I believe this is the issue [1]. Has it been “fixed”?
They're probably referring to the certificate verification that happens when you open any notarized application. Unless something changed recently, the system phones home to ensure its certificate wasn't revoked.
Yeah, because what's being sent is not analytics but related to notarization, verifying the app's integrity (aka is it signed by a certificate known to Apple?).
This came to light a few years ago when the server went down and launching apps became impossibly slow…
I mean, the security features are pretty well documented. The FBI can't crack a modern iPhone even with Apple's help. A lot of the lockdowns are in service of that.
I'm curious: what hardware and software stack do you use?
Edit: I have not posted a source for this claim, because what sort of source would be acceptable for a claim of the form "X has not occurred"?
If you are going to claim Apple's security model has been compromised, you need not only evidence of such a compromise but also an explanation for why such an "obvious" and "cheap" vulnerability has not been disclosed by any number of white or grey-hat hackers.
"Since then, technologies like Grayshift’s GrayKey—a device capable of breaking into modern iPhones—have become staples in forensic investigations across federal, state, and local levels."
"In other cases where the FBI demanded access to data stored in a locked phone, like the San Bernardino and Pensacola shootings, the FBI unlocked devices without Apple’s help, often by purchasing hacking tools from foreign entities like Cellebrite."
> Apple is well on its way to ensuring you can only run things they allow via the App Store
I'm very happy to only run stuff approved on Apple's app store... ESPECIALLY following their introduction of privacy labels for all apps so you know what shit the developer will try to collect from you without wasting your time downloading it.
Also, have you seen the amount of dodgy shit on the more open app stores?
It's a reasonable choice to do so and you can do it now.
The problem starts when Apple forbids it for people who want to install on their computer what they want.
> to purchase a machine that feels like it really belongs to me
How true is this when their devices are increasingly hostile to user repair and upgrades? macOS also tightens the screws on what you can run and from where, or at least requires more hoop jumping over time.
Of course I wish the hardware were somehow more open, but to a large extent, it's directly because of hardware based privacy features.
If you allowed third-party components without restraint, there'd be no way to detect that someone had swapped in a malicious or compromised component.
Lock-in and planned obsolescence are also factors, and ones I'm glad the EU (and others) are pushing back on here. But it isn't as if there are no legitimate tradeoffs.
Regarding screw tightening... if they ever completely remove the ability to run untrusted code, yes, then I'll admit I was wrong. But I am more than happy to have devices be locked down by default. My life has gotten much easier since I got my elderly parents and non-technical siblings to move completely to the Apple ecosystem. That's the tradeoff here.
As far as I can see, it's not possible to connect to a device that uses the same Apple account, which is what I have done in my case. It has to be a different one.
Also, it only seems to work on a local network with hostnames.
Some of us are old enough to remember the era of the officially authorised Apple clones in the 90's.
Some of us worked in hardware repair roles at the time.
Some of us remember the sort of shit the third-party vendors used to sell as clones.
Some of us were very happy the day Apple called time on the authorised clone industry.
The tight-knit integration between Apple OS and Apple Hardware is a big part of what makes their platform so good. I'm not saying perfect. I'm just saying if you look at it honestly as someone who's used their kit alongside PCs for many decades, you can see the difference.
> My life has gotten much easier since I got my elderly parents and non-technical siblings to move completely to the Apple ecosystem. That's the tradeoff here.
“Hacker News” was always the arm of Valley startup mentality, not Slashdot-era Linux enthusiast privacy spook groupthink. It is unfortunate that this change has occurred.
It was mandated by right-to-repair laws, it provides the absolute minimum, and they've attempted to price out people wanting to do repairs. The only way it could be more hostile to users is by literally being illegal.
They could go out of their way to make things actually easy to work on and service, but that has never been the Apple Way. Compare to framework or building your own PC, or even repairing a laptop from another OEM.
Just to clarify, Asahi Linux is working on M3/M4 support. As far as I can tell nothing changed in the boot loader that makes this work more difficult, it just takes time to add the new hardware.
edit: also, unless you are the digital equivalent of "off the grid", I would argue most people are going to need some sort of cloud-based identity anyway for messaging, file-sharing, etc. iCloud is far and away the most secure of the options available to most users, and the only one that uses full end-to-end encryption across all services.
> edit: also, unless you are the digital equivalent of "off the grid", I would argue most people are going to need some sort of cloud-based identity anyway for messaging, file-sharing, etc. iCloud is far and away the most secure of the options available to most users, and the only one that uses full end-to-end encryption across all services.
"You need some cloud-based identity, and this is the best one," even granting its premises, doesn't make being forced into this one a good thing. I'm an Apple user, but there are plenty of people I need to message and share files with who aren't in the Apple ecosystem.
EDIT: As indicated in the reply (written before I added this edit), it sounds like I was ignoring the first part of the post, which pointed out that you aren't forced to use it. I agree that that is a sensible, and even natural and inevitable, reading. I actually wasn't ignoring that part, but I figured the only reason to include this edit was to say "that isn't true, but if it were true, then it would be OK." (Otherwise, what's the point? There's no more complete refutation needed of a false point than that it is false.) My argument is that, if it were true, then that wouldn't be OK, even if you need a cloud-based identity, and even if iCloud is the best one.
I had to set up a Windows computer for the first time in a decade recently, and holy shit did they make it difficult to figure out how to do it without a Microsoft account.
> macOS also tightens the screws on what you can run and from where, or at least requires more hoop jumping over time.
Can you explain what you mean by this? I have been doing software development on MacOS for the last couple of years and have found it incredibly easy to run anything I want on my computer from the terminal, whenever I want. Maybe I'm not the average user, but I use mostly open-source Unix tooling and have never had a problem with permissions or restrictions.
Are you talking about packaged applications that are made available on the App Store? If so, sure have rules to make sure the store is high-quality, kinda like how Costco doesn't let anyone just put garbage on their shelves
> Can you explain what you mean by this? I have been doing software development on MacOS for the last couple of years and have found it incredibly easy to run anything I want on my computer from the terminal, whenever I want.
Try sharing a binary that you built but didn't sign and Notarize and you'll see the problem.
It'll run on the machine that it was built on without a problem, the problems start when you move the binary to another machine.
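For the curious, the hoops look roughly like the sketch below. The identity and keychain-profile names are placeholders you'd have set up beforehand, and notarization requires a paid developer account; treat this as an outline, not a canonical script.

    # Outline of signing + notarizing a standalone binary for distribution.
    import subprocess

    BIN = "./mytool"   # hypothetical binary you want to hand to a coworker
    IDENTITY = "Developer ID Application: Example Corp (TEAMID1234)"  # placeholder

    def run(*cmd):
        subprocess.run(cmd, check=True)

    # Sign with a Developer ID certificate and the hardened runtime.
    run("codesign", "--force", "--options", "runtime", "--timestamp", "--sign", IDENTITY, BIN)
    # Zip it and submit to Apple's notary service, waiting for the verdict.
    run("ditto", "-c", "-k", "--keepParent", BIN, "mytool.zip")
    run("xcrun", "notarytool", "submit", "mytool.zip", "--keychain-profile", "notary-profile", "--wait")
    # App bundles and disk images would also get `xcrun stapler staple`; a bare
    # binary can't be stapled, so the recipient's machine checks the ticket online.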
> I am thrilled to shell out thousands and thousands of dollars to purchase a machine that feels like it really belongs to me, from a company that respects my data and has aligned incentives.
You either have very low standards or very low understanding if you think a completely closed OS on top of completely closed hardware somehow means it 'really belongs' to you, or that your data/privacy is actually being respected.
It's not that bad anymore (e.g. with System76), but I understand the point.
I disagree with OP celebrating Apple to be the least evil of the evils. Yes, there are not many (if any) alternatives, but that doesn't make Apple great. It's just less shitty.
It feels like a lot of people in these threads form their opinions of what desktop Linux is like these days based on one poor experience from back in 2005.
If you're so focused on privacy why don't you just use Linux? With Linux you'll actually get real privacy and you'll really truly own the system.
Apple takes a 30% tax on all applications running on their mobile devices. Just let that sink in. We are so incredibly lucky that never happened to PC.
As much as anyone can say otherwise, running Linux just isn't a breeze. You will run into issues at some point; you will possibly have to make certain sacrifices regarding software or other choices. Yes, it has gotten so much better over the past few years, but I want my time spent on my work, not toying with the OS.
Another big selling point of Apple is the hardware. Their hardware and software are integrated so seamlessly. Things just work, and they work well. 99% of the time - there's always edge cases.
There are solutions for running Linux distros on some Apple hardware, but again you have to make sacrifices.
Even on the machines most well-supported by Linux, which are Intel x86 PCs with only integrated graphics and Intel wifi/bluetooth, there are still issues that need to be tinkered away like getting hardware-accelerated video decoding working in Firefox (important for keeping heat and power consumption down on laptops).
I keep around a Linux laptop and it's improved immensely in the past several years, but the experience still has rough edges to smooth out.
They just have really good marketing. You fell for their pandering. If you really care about privacy use Linux. But Apple ain't it. Closed source and proprietary will never be safe from corporate greed.
They've certainly engaged in a lot of privacy theater before. For example
> Apple oversells its differential privacy protections. "Apple’s privacy loss parameters exceed the levels typically considered acceptable by the differential privacy research community," says USC professor Aleksandra Korolova, a former Google research scientist who worked on Google's own implementation of differential privacy until 2014. She says the dialing down of Apple's privacy protections in iOS in particular represents an "immense increase in risk" compared to the uses most researchers in the field would recommend.
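(For anyone unfamiliar with the jargon: the "privacy loss parameter" is the epsilon in differential privacy, which controls how much noise is added before anything is released; a bigger epsilon means less noise and a weaker guarantee. Here's a toy Laplace-mechanism sketch, which has nothing to do with Apple's actual implementation.)

    import numpy as np

    # Toy Laplace mechanism: release a count with noise calibrated to epsilon.
    # Sensitivity is 1 because one user can change the count by at most 1.
    def private_count(true_count, epsilon):
        return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

    for eps in (0.5, 4.0, 16.0):   # smaller epsilon = more noise = stronger privacy
        samples = [round(private_count(1000, eps)) for _ in range(5)]
        print(eps, samples)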
Does that mean you just don't bother encrypting any of your data, and just use unencrypted protocols? Since you can't inspect the ICs that are doing the work, encryption must all also be security theater.
That's a fine bit of goalpost shifting. They state that they will make their _entire software stack_ for Private Cloud Compute public for research purposes.
Assuming they go through with that, this alone puts them leagues ahead of any other cloud service.
It also means that to mine your data the way everyone else does, they would need to deliberately insert _hardware_ backdoors into their own systems, which seems a bit too difficult to keep secret and a bit too damning a scandal should it be discovered...
Occam's razor here is that they're genuinely trying to use real security as a competitive differentiator.
The approach that the big platforms have to producing their own versions of very successful apps cannibalizes their partners. This focus on consumer privacy by Apple is the company's killer competitive advantage in this particular area, IMO. If I felt they were mining me for my private business data I'd switch to Linux in heartbeat. This is what keeps me off Adobe, Microsoft Office, Google's app suite, and apps like Notion as much as possible.
Of late I have been imagining tears of joy rolling down the face of the person who decides to take it upon themself to sing the paeans of Apple Privacy Theatre on a given day, while Apple has been gleefully diluting privacy on their platforms (along with quality and stability, of course). They are the masters at selling dystopian control, lock-in, and software incompetence as something positive.
It's most dangerous that they own the closed hardware and they own the closed software and then they also get away with being "privacy champions". It's worse than irony.
You hit the nail on the head. And it’s something virtually everyone else replying to you is completely missing.
Apple isn’t perfect. They’re not better at privacy than some absolutist position where you run Tails on RISC V, only connect to services over Tor, host your own email, and run your own NAS.
But of all the consumer focused hardware manufacturers and cloud services companies, they are the only ones even trying.
Privacy is the new obscenity. What does privacy even mean to you concretely? Answer the question with no additional drama, and I guarantee you either Apple doesn’t deliver what you are asking for, or you are using services from another company, like Google, in a way that the actions speak that you don’t really care about what you are asking for.
> End to end encryption by default, such that the cloud provider cannot access my data.
The App Store stores a lot of sensitive data about you and is not end-to-end encrypted. They operate it just like everyone else. You also use Gmail, which is just as sensitive as your iMessages, and Gmail is not end-to-end encrypted, so it's not clear you value that as much as you say.
I think "could a creepy admin see my nudes" or "can my messages be mined to create a profile of my preferences" are much more practical working definitions of privacy than "can someone see that I've installed an app".
End-to-end encryption is certainly the most relevant feature for these scenarios.
App store DRM is a red herring, as a developer I can still run as much untrusted code on my MBP as I want and I don't see that going away any time soon.
You are saying a lot of words but none of them negate the point that Apple has a better security posture for users than any of the other big tech cos. For any meaningful definition of the word "security."
Sure I use gmail, I've been locked in for 15 years. Someday I'll get fed up enough to bite the bullet and move off it.
> Apple has a better security posture for users than any of the other big tech cos. For any meaningful definition of the word "security."
Apple can push updates and change the rules on your device at any time. Rooted Android works better in that regard: you can still use Google stuff on rooted devices. Also I don't think Apple's security posture for users in China is better than every "other big tech co."
The takeaway for me is that Apple's storytelling is really good. They are doing a good job on taking leadership on a limited set of privacy issues that you can convince busy people to feel strongly about. Whether or not that objectively matters is an open question.
There's some weird[1] laws around privacy in Australia, where government departments are blocked from a bunch of things by law. From my perspective as a citizen, this just results in annoyance such as having to fill out forms over and over to give the government data that they already have.
I heard a good definition from my dad: "Privacy for me is pedestrians walking past my window not seeing me step out of the shower naked, or my neighbours not overhearing our domestic arguments."
Basically, if the nude photos you're taking on your mobile phone can be seen by random people, then you don't have privacy.
Apple encrypts my photos so that the IT guy managing the storage servers can't see them. Samsung is the type of company that includes a screen-capture "feature" in their TVs so that they can profile you for ad-targeting. I guarantee you that they've collected and can see the pictures of naked children in the bathtub from when someone used screen mirroring from their phone to show their relatives pictures of their grandkids. That's not privacy.
Sure, I use Google services, but I don't upload naked kid pictures to anything owned by Alphabet corp, so no problem.
However, I will never buy any Samsung product for any purpose because they laugh and point at customer expectations of privacy.
[1] Actually not that weird. Now that I've worked in government departments, I "get" the need for these regulations. Large organisations are made up of individuals, and both the org and the individual people will abuse their access to data for their own benefit. Many such people will even think they're doing the "right thing" while destroying freedom in the process, like people that keep trying to make voting systems traceable... so that vote buying will become easy again.
That was the result of social engineering though, not iCloud being compromised. AFAIK it was a phishing scam, asking the victims for their usernames and passwords.
> asking the victims for their usernames and passwords.
This should illuminate for you that there is nothing special about iCloud privacy or security, in any sense. It has the same real weaknesses as any other service that is a UI for normal people.
>I am thrilled to shell out thousands and thousands of dollars to purchase a machine that feels like it really belongs to me, from a company that respects my data and has aligned incentives.
In my experience, single-core CPU performance is the best all-around indicator of how "fast" a machine feels. I feel like Apple kind of buried this in their press release.
These numbers are misleading (as in, not an apples-to-apples comparison). The M4 has a matrix-multiply hardware extension which can accelerate code written (or compiled) specifically for this extension.
I wonder how reliable Geekbench tests are. AFAIK it's the most common benchmark run on Apple devices, so Apple has a great interest in making sure their newest chips perform great on the test.
I wouldn't be surprised to hear that the Geekbench developers are heavily supported by Apple's own performance engineers and that testing might not be as objective or indicative of real-world perf as one would hope.
> I feel like Apple kind of buried this in their press release
The press release describes the single core performance as the fastest ever made, full stop:
"The M4 family features phenomenal single-threaded CPU performance with the world’s fastest CPU core"
The same statement is made repeatedly across most of the new M4 lineup marketing materials. I think that's enough to get the point across that it's a pretty quick machine.
Not really. Intel CPU performance hasn't changed by orders of magnitude in the last ten years. My ten year old Windoze 10 desktop keeps chugging along fine. My newer 2022 i7 Windows machine works similarly well.
However, attention to keeping Intel Macs performant has taken a dive. My 2019 16" MBP died last week so I fell back to my standby 2014 MBP, and it's much more responsive. No login jank pause.
But it also hasn't been eligible for OS updates for 2 or 3 years.
My new M3 MBP is "screaming fast" with Apple's latest patched OS.
My god, it's ridiculous. I really prefer Linux desktops. They've been snappy for the past 30 years, and don't typically get slow UI's after a year or two of updates.
I don't know much about modern Geekbench scores, but that chart seems to show that M1s are still pretty good? It appears that the M4 is only about 50% faster. Somehow I would expect more like a 100% improvement.
Flameproof suit donned. Please correct me because I'm pretty ignorant about modern hardware. My main interest is playing lots of tracks live in Logic Pro.
Some of it depends on which variant fits you best. But yeah, in general the M1 is still very good--if you hear of someone in your circle selling one for cheap because they're upgrading, nab it.
On the variants: An M1 Max is 10 CPU cores with 8 power and 2 efficiency cores.
M4 Max is 16 cores, 12 + 4. So each power core is 50% faster, but it also has 50% more of them. Add in twice as many efficiency cores, that are also faster for less power, plus more memory bandwidth, and it snowballs together.
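Back-of-envelope, using just those figures (and ignoring the efficiency cores, bandwidth, and scheduling, so take it as a rough sketch rather than a benchmark):

    # Rough multi-core ratio implied by the numbers above.
    m1_p_cores, m4_p_cores = 8, 12
    per_core_speedup = 1.5   # "each power core is 50% faster"

    ratio = (m4_p_cores * per_core_speedup) / m1_p_cores
    print(f"~{ratio:.2f}x M4 Max over M1 Max on P-cores alone")   # ~2.25x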
One nice pseudo-feature of the M1 is that the thermal design of the current MacBook Pro really hasn't changed since then. It was designed with a few generations of headroom in mind, but that means it's very, very hard to make the fans spin on a 16" M1 Max. You have to utilize all CPU/GPU/NPU cores together to even make them move, while an M3 Max is easier to make (slightly) audible.
> it's very, very hard to make the fans spin on a 16" M1 Max. You have to utilize all CPU/GPU/NPU cores together to even make them move,
I routinely get my M1 fans spinning from compiling big projects. You don’t have to get the GPU involved, but when you do it definitely goes up a notch.
I read so much about the M1 Pros being completely silent that I thought something was wrong with mine at first. Nope, it just turns out that most people don’t use the CPU long enough for the fans to kick in. There’s a decent thermal capacity buffer in the system before they ramp up.
Huh! I regularly max CPU for long stretches (game development), but I found I could only get the fans to move if I engaged the neural cores on top of everything else. Something like a 20+ minute video export that's using all available compute for heavy stabilization or something could do it.
The M3 is much more typical behavior, but I guess it's just dumping more watts into the same thermal mass...
The M1 was pretty fast when it debuted. If you own an M1 Mac its CPU has not gotten any slower over the years. While newer M-series might be faster, the old one is no slower.
The M1s are likely to remain pretty usable machines for a few years yet, assuming your workload has not or does not significantly change.
It's not really buried... their headline stat is that it's 1.8x faster than the M1, which is actually a bigger improvement than the actual Geekbench score shows (it would be a score of 4354).
Call me cynical, but when I see headlines like "up to 2x faster", I assume it's a cherry-picked result on some workload where they added a dedicated accelerator.
There's a massive difference between "pretty much every app is 80% faster" and "if you render a 4K ProRes video in Final Cut Pro it's 3x faster."
I've never kept any laptop as long as I've kept the M1. I was more or less upgrading yearly in the past because the speed increases (both in the G4 and then Intel generations) were so significant. This M1 has exceeded my expectations in every category; it's faster, quieter, and cooler than any laptop I've ever owned.
I've had this laptop since release in 2020 and I have nearly 0 complaints with it.
I wouldn't upgrade except that the increase in memory is great (I don't want to have to shut down apps to be able to load some huge LLMs), and I dinged the top case a few months ago, so now there's a shadow on the screen in that spot under some lighting conditions, which is very annoying.
I hope (and expect) the M4 to last just as long as my M1 did.
You'll be glad you did. I loved my 2015 MBP. I even drove 3 hours to the nearest Best Buy to snag one. That display was glorious. A fantastic machine. I eventually gave it to my sister, who continued using it until a few years ago. The battery was gone, but it still worked great.
When you upgrade, prepare to be astonished.
The performance improvement is difficult to convey. It's akin to traveling by horse and buggy. And then hopping into a modern jetliner, flying first class.
It's not just speed. Display quality, build quality, sound quality, keyboard quality, trackpad, ports, etc., have all improved considerably.
The performance jump between a top-of-the-line Intel MBP (I don't remember the year, probably 2019) and the M1 Max I got to replace it... was rather like the perf jump between spinning disks and SSDs.
When I migrated all my laptops to SSDs (lenovos at the time, so it was drop-dead simple), I thought to myself, "this is a once-in-a-generation feeling". I didn't think I would ever be impressed by a laptop's speed ever again. It was nice to be wrong.
> The battery was gone, but it still worked great.
A family 2018 Macbook Air got a second life with a battery replacement. Cheap kit from Amazon, screwdrivers included, extremely easy to do. Still in use, no problems.
My 2015 15" MBP is also still kickin, is/was an absolutely fabulous unit. Was my work machine for 3-4 years, and now another almost-6 years as my personal laptop. My personal use case is obviously not very demanding but it's only now starting to really show its age.
I also have a M1 from work that is absolutely wonderful, but I think it's time for me to upgrade the 2015 with one of these new M4s.
I was also a mid-2012 MBP user. I eventually got the M2 MBA because I was investing in my eyesight (modern displays are significantly better). I was never impressed with the touchbar-era Macs; they didn't appeal to me and their keyboards were terrible.
I think these M-series MacBook Airs are a worthy successor to the 2012 MBP. I fully intend to use this laptop for at least the same amount of time, ideally more. The lack of a replaceable battery will probably be the eventual killer, which is a shame.
That is amazing. Mine lasted for a super long time as well, and like you, I upgraded everything to its max. I think it was the last model with a 17 inch screen.
Sold mine last year for $100 to some dude who claimed to have some software that only runs on that specific laptop. I didn't question it.
I still have my 2015, and it lived just long enough to keep me going until the Touch Bar and its horrible keyboard went away, at which point I immediately bought the M1 Pro on release day.
I was considering upgrading to an M3 up until about a month ago when Apple replaced my battery, keyboard, top case, and trackpad completely for free. An upgrade would be nice as it no longer supports the latest MacOS, but at this point, I may just load Ubuntu on the thing and keep using it for another few years. What a machine.
I used that same model for 5 years until I finally upgraded in 2017, and I totally regretted it; the upgrade was not worth it at all, and I would have been just as happy with the 2012. I quickly replaced it again with the "Mea Culpa" 2019 where they added back in ports, etc., which would have been just about worth the upgrade over the 2012, 7 years later, but again, not by a big margin.
The 2012 MBP 15" Retina was probably the only machine I bought where the performance actually got better over the years, as the OS got more optimized for it (the early OS revisions had very slow graphics drivers dealing with the retina display)
The M1 Pro on the other hand, that was a true upgrade. Just a completely different experience to any Apple Intel laptop.
I've just replaced a 2012 i5 MBP that I used for dev work and presentations into 2018.
It has gotten significantly slower over the last 2 years, but the more obvious issues are the sound, the inability to do virtual backgrounds, and now the lack of software updates.
But if you had told me I'd need to replace it in 2022, I wouldn't have believed you.
Ah, my 2013 MBP died in 2019. It was the GPU. There was no way to repair it cheaply enough, so I had to replace it with a 2019 MBP, which was the computer I kept the shortest (I hated the keyboard).
How do you justify this kind of recurring purchase, even accounting for selling your old device? I don't get the behaviour or the driving decision factor past the obvious "I need the latest shiny toy" (I can't find the exact words to describe it, so apologies for the reductive description).
Over the years I have either assembled my own desktop computers or purchased ex-corporate Lenovos, with a mix of Windows (for gaming, obviously) and Linux, and only recently (4 years ago) was given an MBP by work, as IT cannot manage Linux machines the way they do macOS and Windows.
I have moved from an Intel i5 MBP to an M3 Pro (?), and it makes me want to throw away the dependable ThinkPad/Fedora machine I still use for personal projects.
I easily spend 100 hours a week across the two, split less evenly than it should be.
I don't buy them because I need something new, I buy them because in the G4/Intel era, the iterations were massive and even a 20 or 30% increase in speed (which could be memory, CPU, disk -- they all make things faster) results in me being more productive. It's worth it for me to upgrade immediately when apple releases something new, as long as I have issues with my current device and the upgrade is enough of a delta.
M1 -> M2 wasn't much of a delta and my M1 was fine.
M1 -> M3 was a decent delta, but, my M1 was still fine.
M1 -> M4 is a huge delta (almost double) and my screen is dented to where it's annoying to sit outside and use the laptop (bright sun makes the defect worse), so, I'm upgrading. If I hadn't dented the screen the choice would be /a lot/ harder.
I love ThinkPads too. Really can take a beating and keep on going. The post-IBM era ones are even better in some regards too. I keep one around running Debian for Linux-emergencies.
There are 2 things I have always spent money on if I felt they weren't close to the best achievable: my bed and my laptop. Even the phone can be a 4-year-old iPhone, but the laptop must be the best and fast. My sleep is also pretty important. Everything else is just "eco".
In my country you can buy a device and write it off over 2 years, with the VAT reimbursed, then scrap it from the books and sell it, without VAT, to people who would otherwise pay a pretty hefty VAT. This decreases your loss of value to about half.
Apple has a pretty good trade-in program. If you have an Apple card, it's even better (e.g. the trade-in value is deducted immediately, zero interest, etc.).
Could you get more money by selling it? Sure. But it's hard to beat the convenience. They ship you a box. You seal up the old device and drop it off at UPS.
I also build my desktop computers, with a mix of Windows and Linux. But those get upgraded piecemeal over the years rather than replaced regularly.
>I've never kept any laptop as long as I've kept the M1
What different lives we live. The first M1 came out in November 2020, not even four years ago. I've never had a [personal] computer for _less_ time than that. (Work, yes, due to changing jobs or company-dictated changes/upgrades)
Interesting. I have found occasion to use it for pretty much every Mac I've owned since the 1980s! I'm not sure how much money it's saved compared to just paying for repairs when needed, but I suspect it may come out to:
1) a slight overall savings, though I'm not sure about that.
2) a lack of stress when something breaks. Even if there isn't an overall savings, for me it's been worth it because of that.
Certainly, my recent Mac repair would have cost $1500 and I only paid $300, and I think I've had the machine for about 3 years, so there's a savings there, and considerably less stress. That's similar to the experience I've had all along, although this recent expense would probably have been my most expensive repair ever.
"SSD is soldered on" is a bit of glossing over of the issue with the M-series Macs.
Apple is putting raw NAND chips on the board (and yes, soldering them), and the controller for the SSD is part of the M-series chip. Yes, Apple could use NVMe here if you ignore the physical constraints, ignore the fact that it wouldn't be quite as fast, and ignore the fact that it would increase their BOM cost.
I'm not saying Apple is definitively correct here, but, it's good to have choice and Apple is the only company with this kind of deeply integrated design. If you want a fully modular laptop, go buy a framework (they are great too!) and if you want something semi-modular, go buy a ThinkPad (also great!).
Yeah, I always have AppleCare. I view it as part of the cost of a mac (or iPhone).
And yeah, this incident reminded me of why it's important to back up as close to daily as you can, or even more often during periods when you're doing important work and want to be sure you have the intermediate steps.
I had a 2019 i9 for a work laptop. It was absolutely awful, especially with the corporate anti-virus / spyware on it that brought it to a crawl. Fans would run constantly. Any sort of Node JS build would make it sound like a jet engine.
I have the OG 13" MBP M1, and it's been great; I only have two real reasons I'm considering jumping to the 14" MBP M4 Pro finally:
- More RAM, primarily for local LLM usage through Ollama (a bit more headroom for bigger models would be nice; there's a rough sketch of that usage after this comment)
- A bit niche, but I often run multiple external displays. DisplayLink works fine for this, but I also use live captions heavily and Apple's live captions don't work when any form of screen sharing/recording is enabled... which is how DisplayLink works. :(
Not quite sold yet, but definitely thinking about it.
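To make the Ollama point above concrete, here is a minimal sketch of local chat through the official ollama Python client. The model tag ("llama3.1:70b") is an illustrative choice, not anything the commenter specified, and the idea that a 70B-class quant is where the extra unified memory starts to matter is an assumption.

    # Minimal local-LLM chat via the ollama Python client (pip install ollama).
    # Assumes the Ollama server is running locally and the chosen model tag has
    # already been pulled; big quants are where 48GB+ of unified memory pays off.
    import ollama

    response = ollama.chat(
        model="llama3.1:70b",   # illustrative model choice
        messages=[{"role": "user", "content": "Why does unified memory help local inference?"}],
    )
    print(response["message"]["content"])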
Yep. That's roughly 20% per generation improvement which ain't half-bad these days, but the really huge cliff was going from Intel to the M1 generation.
M1 series machines are going to be fine for years to come.
It feels like the M1 was the revolution and subsequent ones evolution - a smaller fabrication process for improved energy efficiency, more cores for more power, higher memory (storage?) bandwidth, more displays (that was a major and valid criticism of the M1, even though in practice more than one external screen is a relatively rare use case, affecting maybe <5% of users).
Actually, wasn't the M1 itself an evolution / upscale of their A-series CPUs, which by now they've been working on since... before 2010? The iPhone 4 was the first one with their own CPU, although the design was from Samsung + Intrinsity; it was only the A6 that they claimed was custom-designed by Apple.
It's a very robust and capable small laptop. I'm typing this to a M1 Macbook Air.
The only thing to keep in mind is that the M1 was the first CPU in the transition from Intel CPUs (+ AMD GPUs) to Apple Silicon. The M1 was still missing a bunch of things the earlier machines had, which Apple added over time via the M1 Pro and later chips. The graphics in particular were sufficient for a small laptop, but not for much beyond that. Better GPUs and media engines were developed later. Today, the M3 in a MacBook Air or the M4 in the MacBook Pro have all of that.
For me the biggest surprise was how well the M1 Macbook Air actually worked. Apple did an outstanding job in the software & hardware transition.
Yep. Bought an M1 Max in 2021 and it still feels fast, battery lasts forever. I’m sure the M4 would be even quicker (Lightroom, for example) but there’s little reason to consider an upgrade any time soon.
I still use my MacBook Air M1 and given my current workloads (a bit of web development, general home office use and occasional video editing and encoding) I doubt I’ll need to replace it in the coming 5 years. That’ll be an almost 10 year lifespan.
In early 2020, I had an aging 2011 Air that was still struggling after a battery replacement. Even though I "knew" the Apple Silicon chips would be better, I figured a 2020 Intel Air would last me a long time anyway, since my computing needs from that device are light, and who knows how many years the Apple Silicon transition would take anyway?
Bought a reasonably well-specced Intel Air for $1700ish. The M1s came out a few months later. I briefly thought about the implication of taking a hit on my "investment", figured I might as well cry once rather than suffer endlessly. Sold my $1700 Intel Air for $1200ish on craigslist (if I recall correctly), picked up an M1 Air for about that same $1200 pricepoint, and I'm typing this on that machine now.
That money was lost as soon as I made the wrong decision, I'm glad I just recognized the loss up front rather than stewing about it.
Exact same boat here. A friend and I both bought the 2020 Intel MBA thinking that the M1 version was at least a year out. It dropped a few months later. I immediately resold my Intel MBA seeing the writing on the wall and bought a launch M1 (which I still use to this day). Ended up losing $200 on that mis-step, but no way the Intel version would still get me through the day.
That said...scummy move by Apple. They tend to be a little more thoughtful in their refresh schedule, so I was caught off guard.
When I saw the M1s come out, I thought that dev tooling would take a while to work on M1, which was correct. It probably took a year for most everything to be compiled for arm64. However, I had too little faith in Rosetta and underestimated the speed upgrade the M1 really brought. So what I mean to say is, I still have that deadweight MBA that I only use for web browsing :)
Oh yes, my wife bought a new Intel MBA in summer 2020... I told her at the time that Apple was planning its own chip, but that it couldn't be much better than the Intel one, and surely Apple would increase prices too... I was so wrong.
I switched from a 2014 MacBook Pro to a 2020 M1 MacBook Air. Yeah, the CPU is much faster, but the build quality and software are a huge step backwards. The trackpad feels fake and not nearly as responsive, and the keyboard also doesn't feel as solid. But now I'm already used to it.
With Whisky I feel like I'd never need anything else. That said, the benchmark jump in the M4 has me thinking I should save up and grab a refurb in a year or two.
The M1 Pro compared to Intel was such a big step ahead that I suppose we are all still surprised and excited. Quiet, long battery life, and better performance. By a lot!
I wonder if the M4 really feels that much faster and better - having an M1 Pro I'm not going to change quickly, but maybe a Mac Mini will land here some day.
Honestly it was a game changer. Before I'd never leave the house without a charger, nowadays I rarely bring it with me on office days, even with JS / front-end workloads.
(of course, everyone else has a macbook too, there's always someone that can lend me a charger. Bonus points that the newer macbooks support both magsafe and USB-C charging. Added bonus points that they brought back magsafe and HDMI ports)
I have the same one, but everyone I know with an M series Mac says the same thing. These are the first machines in a long time built to not only last a decade but be used for it.
I have the M1 MacBook Pro with MagSafe and I still charge it via USB C simply because I can't be bothered to carry around another cable when all of my other peripherals are USB C.
It's the other way around, isn't it? MagSafe was removed in the 2016-2019 model years (not sure why; maybe to shave off another bit of thickness?), and then brought back in 2021 on the MacBook Pro and 2022 on the MacBook Air.
Personally, I practically never use MagSafe, because the convenience of USB C charging cables all over the house outweighs the advantages of MagSafe for me.
Apple is using LPDDR5 for M3. The bandwidth doesn't come from unified memory - it comes from using many channels. You could get the same bandwidth or more with normal DDR5 modules if you could use 8 or more channels, but in the PC space you don't usually see more than 2 or 4 channels (only common for servers).
Unrelated but unified memory is a strange buzzword being used by Apple. Their memory is no different than other computers. In fact, every computer without a discrete GPU uses a unified memory model these days!
On PC desktops I always recommend getting a mid-range tower server precisely for that reason. My oldest one is about 8 years old and only now it's showing signs of age (as in not being faster than the average laptop).
The new idea is having 512 bit wide memory instead of PC limitation of 128 bit wide. Normal CPU cores running normal codes are not particularly bandwidth limited. However APUs/iGPUs are severely bandwidth limited, thus the huge number of slow iGPUs that are fine for browsing but terrible for anything more intensive.
So apple manages decent GPU performance, a tiny package, and great battery life. It's much harder on the PC side because every laptop/desktop chip from Intel and AMD use a 128 bit memory bus. You have to take a huge step up in price, power, and size with something like a thread ripper, xeon, or epyc to get more than 128 bit wide memory, none of which are available in a laptop or mac mini size SFF.
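A quick sketch of why the bus width matters so much here: peak bandwidth is just width times transfer rate. The 8533 MT/s and 6400 MT/s figures are the rates implied by numbers quoted elsewhere in this thread (546 GB/s on a 512-bit bus, 819.2 GB/s on a 1024-bit bus), and the DDR5-5600 desktop figure is a typical configuration rather than a specific product.

    # Peak theoretical bandwidth = (bus width in bytes) * (transfers per second).
    def peak_gb_s(bus_bits: int, mt_s: int) -> float:
        return bus_bits / 8 * mt_s / 1000   # bytes * MT/s -> MB/s -> GB/s

    print(peak_gb_s(128, 5600))    # typical 2-channel desktop DDR5-5600: ~89.6 GB/s
    print(peak_gb_s(512, 8533))    # 512-bit LPDDR5X (Max-class):        ~546 GB/s
    print(peak_gb_s(1024, 6400))   # 1024-bit LPDDR5 (M2 Ultra):         ~819 GB/s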
> The new idea is having 512 bit wide memory instead of PC limitation of 128 bit wide.
It's not really a new idea, just unusual in computers. The custom SoCs that AMD makes for PlayStation and Xbox have wide (up to 384-bit) unified memory buses, very similar to what Apple is doing, with the main distinction being Apple's use of low-power LPDDR instead of the faster but power-hungrier GDDR used in the consoles.
Yeah, a lot of it is just market forces. I guess going to four channels is costly for the desktop PC space and that's why that didn't happen, and laptops just kind of followed suit. But now that Apple is putting pressure on the market, perhaps we'll finally see quad channel becoming the norm in desktop PCs? Would be nice...
Memory interface width of modern CPUs is 64-bit (DDR4) and 32+32 (DDR5).
No CPU uses a 128-bit-wide memory channel, as it would result in overfetching data, i.e., 128B per access, or two cache lines.
AFAIK Apple uses 128B cache lines, so they can do a much better design and customization of the memory subsystem, as they do not have to use DIMMs -- they simply solder DRAM to the motherboard, so the memory interface is whatever they want.
> Memory interface width of modern CPUs is 64-bit (DDR4) and 32+32 (DDR5).
Sure, per channel. PCs have 2x64 bit or 4x32 bit memory channels.
Not sure I get your point, yes PCs have 64 bit cache lines and apple uses 128. I wouldn't expect any noticeable difference because of this. Generally cache miss is sent to a single memory channel and result in a wait of 50-100ns, then you get 4 or 8 bytes per cycle at whatever memory clock speed you have. So apple gets twice the bytes per cache line miss, but the value of those extra bytes is low in most cases.
Other bigger differences are that Apple has a larger page size (16KB vs 4KB) and ARM supports a looser memory model, which makes it easier to reach a large fraction of peak memory bandwidth.
However, I don't see any relationship between Apple and PCs as far as DIMMs. Both Apple and PCs can (and do) solder DRAM chips directly to the motherboard, normally on thin/light laptops. The big difference between Apple and PCs is that Apple supports 128-, 256-, and 512-bit-wide memory on laptops and 1024-bit on the Studio (a bit bigger than most SFFs). Getting more than 128 bits with a PC means no laptops and no SFFs - generally large workstations with Xeons, Threadrippers, or Epycs, with substantial airflow and power requirements.
FYI cache lines are 64 bytes, not bits. So Apple is using 128 bytes.
Also important to consider that the RTX 4090 has a relatively tiny 384-bit memory bus. Smaller than the M1 Max's 512-bit bus. But the RTX 4090 has 1 TB/s bandwidth and significantly more compute power available to make use of that bandwidth.
The M4 max is definitely not a 4090 killer, does not match it in any way. It can however work on larger models than the 4090 and have a battery that can last all day.
My memory is a bit fuzzy, but I believe the M3 Max did decently on some games compared to the laptop Nvidia 4070 (which is not the same as the desktop 4070). But it highly depended on whether the game was x86-64 (requiring emulation) and whether it was DX11 or Apple-native. I believe Apple claims improvements in Metal (Apple's GPU library) and that the M4 GPUs have better FP for ray tracing, but no significant changes in rasterized performance.
I look forward to the 3rd party benchmarks for LLM and gaming on the m4 max.
Eh… not quite. Maybe on an Instinct. Unified memory means the CPU and GPU can do zero-copy and use the same memory buffer.
Many integrated graphics segregate the memory into CPU owned and GPU owned, so that even if data is on the same DIMM, a copy still needs to be performed for one side to use what the other side already has.
This means that the drivers, etc., all have to understand the unified memory model; it's not just hardware sharing DIMMs.
Yes, you could buy a brand new (announced weeks ago) AMD Turin. 12 channels of DDR5-6000, $11,048 and 320 watts (for the CPU) and get 576GB/sec peak.
Or you could buy a M3 max laptop for $4k, get 10+ hour battery life, have it fit in a thin/light laptop, and still get 546GB/sec. However those are peak numbers. Apple uses longer cache lines (double), large page sizes (quadruple), and a looser memory model. Generally I'd expect nearly every memory bandwidth measure to win on Apple over AMD's turin.
AnandTech did bandwidth benchmarks for the M1 Max and was only able to utilize about half of it from the CPU, and the GPU used even less in 3D workloads because it wasn't bandwidth limited. It's not all about bandwidth. https://www.anandtech.com/show/17024/apple-m1-max-performanc...
Indeed. RIP Anandtech. I've seen bandwidth tests since then that showed similar for newer generations, but not the m4. Not sure if the common LLM tools on mac can use CPU (vector instructions), AMX, and Neural engine in parallel to make use of the full bandwidth.
You lose out on things like expandability (more storage, more PCIe lanes) and repairability though. You are also (on M4 for probably a few years) compelled to use macOS, for better or worse.
There are, in my experience, professionals who want to use the best tools someone else builds for them, and professionals who want to keep iterating on their tools to make them the best they can be. It's the difference between, say, a violin and a Eurorack. Neither's better or worse, they're just different kinds of tools.
I was sorely tempted by the Mac studio, but ended up with a 96GB ram Ryzen 7900 (12 core) + Radeon 7800 XT (16GB vram). It was a fraction of the price and easy to add storage. The Mac M2 studio was tempting, but wasn't refreshed for the M3 generation. It really bothered me that the storage was A) expensive, B) proprietary, C) tightly controlled, and D) you can't boot without internal storage.
Even moving storage between Apple studios can be iffy. Would I be able to replace the storage if it died in 5 years? Or expand it?
As tempting as the size, efficiency, and bandwidth were, I just couldn't justify top dollar without knowing how long it would be useful. Sad they just didn't add two NVMe ports or make some kind of raw storage (NVMe flash, but without the smarts).
> Even moving storage between Apple studios can be iffy.
This was really driven home to me by my recent purchase of an Optane 905p, a drive that is both very fast and has an MTBF measured in the hundreds of years. Short of a power surge or (in California) an earthquake, it's not going to die in my lifetime -- why should I not keep using it for a long time?
Many kinds of professionals are completely fine with having their Optanes and whatnot plugged in externally, even though it may mean their boot drive will likely die at some point. That's completely okay, I think.
I doubt you'll get 10+ hours on battery if you utilize it at max. I don't even know if it can really sustain the maximum load for more than a couple of minutes because of thermal or some other limits.
FWIW I ran a quick test of gemma.cpp on M3 Pro with 8 threads. Similar PaliGemma inference speed to an older AMD (Rome or Milan) with 8 threads. But the AMD has more cores than that, and more headroom :)
Yeah memory bandwidth is one of the really unfortunate things about the consumer stuff. Even the 9950x/7950x, which are comfortably workstation-level in terms of compute, are bound by their 2 channel limits. The other day I was pricing out a basic Threadripper setup with a 7960x (not just for this reason but also for more PCIe lanes), and it would cost around $3000 -- somewhat out of my budget.
This is one of the reasons the "3D vcache" stuff with the giant L3 cache is so effective.
"Unified memory" doesn't really imply anything about the memory being located on-package, just that it's a shared pool that the CPU, GPU, etc. all have fast access to.
Also, DRAM is never on-die. On-package, yes, for Apple's SoCs and various other products throughout the industry, but DRAM manufacturing happens in entirely different fabs than those used for logic chips.
It's mostly an IBM thing. In the consumer space, it's been in game consoles with IBM-fabbed chips. Intel's use of eDRAM was on a separate die (there was a lot that was odd about those parts).
For comparison, a Threadripper Pro 5000 workstation with 8x DDR4 3200 has 204.8GB/s of memory bandwidth.
The Threadripper Pro 7000 with DDR5-5200 can achieve 325GB/s.
And no, manaskarekar, the M4 Max does 546 GB/s, not Gbps (which would be 8x less!).
Thanks for the numbers. Someone here on hackernews got me convinced that a Threadripper would be a better investment for inference than a MacBook Pro with a M3 Max.
> So for example if you have a server with 16 DDR5 DIMMs (sticks) it equates to 1,024 GB/s of total bandwidth.
Not quite, as it depends on the number of channels and not on the number of DIMMs. An extreme example: put all 16 DIMMs on a single channel and you will get the performance of a single channel.
If you're referring to the line you quoted, then no, it's not wrong. Each DIMM is perfectly capable of 64GiB/s, just as the article says. Where it might be confusing is that this article seems to only be concerning itself with the DIMM itself and not with the memory controller on the other end. As the other reply said, the actual bandwidth available also depends on the number of memory channels provided by the CPU, where each channel provides one DIMM worth of bandwidth.
This means that in practice, consumer x86 CPUs have only 128GiB/s of DDR5 memory bandwidth available (regardless of the number of DIMM slots in the system), because the vast majority of them only offer two memory channels. Server CPUs can offer 4, 8, 12, or even more channels, but you can't just install 16 DIMMs and expect to get 1024GiB/s of bandwidth, unless you've verified that your CPU has 16 memory channels.
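A small illustration of that point, using the quoted article's ~64 GB/s per-channel figure (roughly DDR5-8000): extra DIMMs beyond the channel count add capacity, not bandwidth.

    # Usable bandwidth scales with memory channels, not with populated DIMM slots.
    PER_CHANNEL_GB_S = 64   # the quoted article's per-DIMM/per-channel figure

    def usable_bandwidth_gb_s(dimms: int, channels: int) -> int:
        # DIMMs beyond the channel count share channels: more capacity, same bandwidth.
        return min(dimms, channels) * PER_CHANNEL_GB_S

    print(usable_bandwidth_gb_s(dimms=16, channels=2))    # consumer desktop: 128
    print(usable_bandwidth_gb_s(dimms=16, channels=16))   # 16-channel server: 1024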
It's not the memory being unified that makes it fast, it's the combination of the memory bus being extremely wide and the memory being extremely close to the processor. It's the same principle that discrete GPUs or server CPUs with onboard HBM memory use to make their non-unified memory go ultra fast.
No, unified memory usually means the CPU and GPU (and miscellaneous things like the NPU) all use the same physical pool of RAM and moving data between them is essentially zero-cost. That's in contrast to the usual PC setup where the CPU has its own pool of RAM, which is unified with the iGPU if it has one, but the discrete GPU has its own independent pool of VRAM and moving data between the two pools is a relatively slow operation.
An RTX4090 or H100 has memory extremely close to the processor but I don't think you would call it unified memory.
I don't quite understand one of the finer points of this, under caffeinated :) - if GPU memory is extremely close to the CPU memory, what sort of memory would not be extremely close to the CPU?
I think you misunderstood what I meant by "processor", the memory on a discrete GPU is very close to the GPUs processor die, but very far away from the CPU. The GPU may be able to read and write its own memory at 1TB/sec but the CPU trying to read or write that same memory will be limited by the PCIe bus, which is glacially slow by comparison, usually somewhere around 16-32GB/sec.
A huge part of optimizing code for discrete GPUs is making sure that data is streamed into GPU memory before the GPU actually needs it, because pushing or pulling data over PCIe on-demand decimates performance.
I thought it meant that both the GPU and the CPU can access it. In most systems, GPU memory cannot be accessed by the CPU (without going through the GPU); and vice versa.
CPUs access GPU memory via MMIO (though usually only a small portion), and GPUs can in principle access main memory via DMA. Meaning, both can share an address space and access each other’s memory. However, that wouldn’t be called Unified Memory, because it’s still mediated by an external bus (PCIe) and thus relatively slower.
This is still half the speed of a consumer NVidia card, but the large amounts of memory is great, if you don't mind running things more slowly and with fewer libraries.
Was this example intended to describe any particular device? Because I'm not aware of anything that operates at 8800 MT/s, especially not with 64-bit channels.
That seems unlikely given the mismatched memory speed (see the parent comment) and the fact that Apple uses LPDDR which is typically 16 bits per channel. 8800MT/s seems to be a number pulled out of thin air or bad arithmetic.
Heh, ok, maybe slightly different. But apple spec claims 546GB/sec which works out to 512 bits (64 bytes) * 8533. I didn't think the point was 8533 vs 8800.
I believe I saw somewhere that the actual chips used are LPDDR5X-8533.
Effectively the parent's formula describes the M4 Max, give or take 5%.
Fewer libraries? Any that a normal LLM user would care about? Pytorch, ollama, and others seem to have the normal use cases covered. Whenever I hear about a new LLM seems like the next post is some mac user reporting the token/sec. Often about 5 tokens/sec for 70B models which seems reasonable for a single user.
Is there a normal LLM user yet? Most people would want their options to be as wide as possible. The big ones usually get covered (eventually), and there are distinct good libraries emerging for Mac only (sigh), but last I checked the experience of running every kit (stable diffusion, server-class, etc) involved overhead for the Mac world.
> This is still half the speed of a consumer NVidia card, but the large amounts of memory is great, if you don't mind running things more slowly and with fewer libraries.
But it has more than 2x longer battery life and a better keyboard than a GPU card ;)
A 24gb model is fast and ranks 3.
A 70b model is slow and 8.
A top tier hosted model is fast and 100.
Past what specialized models can do, it's about a mixture/agentic approach and next level, nuclear power scale. Having a computer with lots of relatively fast RAM is not magic.
Thanks, but just to put things into perspective, this calculation has counted 8 channels which is 4 DIMMs and that's mostly desktops (not dismissing desktops, just highlighting that it's a different beast).
Desktops are two channels of 64 bits, or with DDR5 now four (sub)channels of 32 bits; either way, mainstream desktop platforms have had a total bus width of 128 bits for decades. 8x64 bit channels is only available from server platforms. (Some high-end GPUs have used 512-bit bus widths, and Apple's Max level of processors, but those are with memory types where the individual channels are typically 16 bits.)
The vast majority of x86 laptops and desktops are 128 bits wide: often 2x64-bit channels until recently, now 4x32-bit with DDR5 in the last year or so. There are some benefits to 4 channels over 2, but generally you are still limited by 128 bits unless you buy a Xeon, Epyc, or Threadripper (or the Intel equivalent), which are expensive, hot, and don't fit in SFFs or laptops.
So basically the PC world is crazy behind the 256, 512, and 1024 bit wide memory busses apple has offered since the M1 arrived.
I don't think so? That PDF I linked is from 2015, way before Apple put focus on it through their M-series chips... And the Wikipedia article on "Glossary of computer graphics" has had an entry for unified memory since 2016: https://en.wikipedia.org/w/index.php?title=Glossary_of_compu...
For Apple to have come up with using the term "unified memory" to describe this kind of architecture, they would've needed to come up with it at least before 2016, meaning A9 chip or earlier. I have paid some attention to Apple's SoC launches through the years and can't recall them touting it as a feature in marketing materials before the M1. Do you have something which shows them using the term before 2016?
To be clear, it wouldn't surprise me if it has been used by others before Intel did in 2015 as well, but it's a starting point: if Apple hasn't used the term before then, we know for sure that they didn't come up with it, while if Apple did use it to describe A9 or earlier, we'll have to go digging for older documents to determine whether Apple came up with it
There are actual differences but they're mostly up to the drivers. "Shared" memory typically means it's the same DRAM but part of it is carved out and can only be used by the GPU. "Unified" means the GPU/CPU can freely allocate individual pages as needed.
I'm curious about getting one of these to run LLM models locally, but I don't understand the cost benefit very well. Even 128GB can't run, like, a state of the art Claude 3.5 or GPT 4o model right? Conversely, even 16GB can (I think?) run a smaller, quantized Llama model. What's the sweet spot for running a capable model locally (and likely future local-scale models)?
You'll be able to run 72B models w/ large context, lightly quantized with decent'ish performance, like 20-25 tok/sec. The best of the bunch are maybe 90% of a Claude 3.5.
If you need to do some work offline, or for some reason the place you work blocks access to cloud providers, it's not a bad way to go, really. Note that if you're on battery, heavy LLM use can kill your battery in an hour.
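For anyone trying to map model sizes to memory configs, here is a rough sizing sketch. The ~0.6 bytes/parameter figure approximates a 4-bit quant with overhead, and the fixed allowance for KV cache plus the OS is a guess; treat the outputs as ballpark, not measurements.

    # Ballpark unified-memory requirement for running a quantised model locally.
    def approx_ram_gb(params_billion: float,
                      bytes_per_param: float = 0.6,       # ~4-bit quant + overhead
                      kv_and_os_gb: float = 8.0) -> float:  # context + OS allowance
        return params_billion * bytes_per_param + kv_and_os_gb

    for size in (8, 32, 70):
        print(f"{size}B model: ~{approx_ram_gb(size):.0f} GB")
    # ~13, ~27, ~50 GB: an 8B quant fits a 16GB machine, while 70B-class models
    # want 64GB+ once you add a long context; "lightly quantised" pushes it higher.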
We run our LLM workloads on a M2 Ultra because of this. 2x the VRAM; one-time cost at $5350 was the same as, at the time, 1 month of 80GB VRAM GPU in GCP. Works well for us.
Can you elaborate: are those workloads queued, or can they serve multiple users in parallel?
I think it's super interesting to know real-life workflows and performance of different LLMs and hardware, in case you can direct me to other resources.
Thanks!
At some point there should be an upgrade to the M2 Ultra. It might be an M4 Ultra, it might be this year or next year. It might even be after the M5 comes out. Or it could be skipped in favour of the M5 Ultra. If anyone here knows they are definitely under NDA.
They aren't going to be using fp32 for inferencing, so those FP numbers are meaningless.
Memory and memory bandwidth matters most for inferencing. 819.2 GB/s for M2 Ultra is less than half that of A100, but having 192GB of RAM instead of 80gb means they can run inference on models that would require THREE of those A100s and the only real cost is that it takes longer for the AI to respond.
3 A100 at $5300/mo each for the past 2 years is over $380,000. Considering it worked for them, I'd consider it a massive success.
From another perspective though, they could have bought 72 of those Ultra machines for that much money and had most devs on their own private instance.
The simple fact is that Nvidia GPUs are massively overpriced. Nvidia should worry a LOT that Apple's private AI cloud is going to eat their lunch.
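The arithmetic behind that comparison, using only the figures quoted in this subthread:

    # Cloud A100 rental vs. a one-time M2 Ultra purchase, figures from above.
    a100_per_month = 5300     # quoted monthly cost of one 80GB A100 in GCP
    a100_count = 3            # A100s needed to match 192GB of memory
    months = 24

    cloud_total = a100_per_month * a100_count * months
    ultra_price = 5350        # quoted one-time cost of the M2 Ultra

    print(cloud_total)                  # 381600 -> "over $380,000"
    print(cloud_total / ultra_price)    # ~71.3  -> roughly the 72 machines cited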
About 10-20% of my company's GPU usage is inference dev. Yes, a horribly inefficient use of resources. We could upgrade the 100-ish devs who do this dev work to M4 MBPs and free up GPU resources.
High availability story for AI workloads will be a problem for another decade. From what I can see the current pressing problem is to get stuff working quickly and iterate quickly.
Having 128GB is really nice if you want to regularly run different full OSes as VMs simultaneously (and if those OSes might in turn have memory-intensive workloads running on them).
At least in the recent past, a hindrance was that MacOS limited how much of that unified memory could be assigned as VRAM. Those who wanted to exceed the limits had to tinker with kernel settings.
I wonder if that has changed or is about to change as Apple pivots their devices to better serve AI workflows as well.
I am always wondering if one shouldn't be doing the resource intensive LLM stuff in the cloud. I don't know enough to know the advantages of doing it locally.
you'd probably save money just paying for a VPS. And you wouldn't cook your personal laptop as fast. Not that people nowadays keep their electronics for long enough for that to matter :/
This is definitely tempting me to upgrade my M1 macbook pro. I think I have 400GB/s of memory bandwidth. I am wondering what the specific number "over half a terabyte" means.
The weird thing about these Apple product videos in the last few years is that there are all these beautiful shots of Apple's campus with nobody there other than the presenter. It's a beautiful stage for these videos, but it's eerie and disconcerting, particularly given Apple's RTO approach.
You would just think that with a brand so intrinsically wrapped around the concept of technology working for and with the people that use it, you'd want to show the people who made it if you're going to show the apple campus at all.
It kind of just comes off as one of those YouTube liminal space horror videos when it's that empty.
You can even go back to 1983 "Two kinds of people": a solitary man walks into an empty office, works by himself on the computer and then goes home for breakfast. https://youtu.be/4xmMYeFmc2Q
It's a strange conflict. So much of their other stuff is about togetherness mediated by technology (eg, facetime). And their Jobs-era presentations always ended with a note of appreciation for the folks who worked so hard to make the launch happen. But you're right that much of the brand imagery is solitary, right up to the whole "Here's to the crazy ones" vibe.
It's weirdly dystopian. I didn't realize it bothered me until moments before my comment, but now I can't get it out of my head.
Even if only in some shots: they are such a valuable company that they simply cannot afford the risk of, e.g., criticism for the choice of people they display, or inappropriate outfits or behaviour. One blip from a shareholder can cost them billions in value, which pisses off other shareholders. All of their published media, from videos like this to their conferences, are highly polished, rehearsed, and designed by committee. Microsoft and Google are the same, although at least with Google there's still room for some comedy in some of their departments: https://youtu.be/EHqPrHTN1dU
> You would just think that with a brand so intrinsically wrapped around the concept of technology working for and with the people that use it, you'd want to show the people who made it if you're going to show the apple campus at all.
I would think that a brand that is at least trying to put some emphasis on privacy in their products would also extend the same principle to their workforce. I don’t work for Apple, but I doubt that most of their employees would be thrilled about just being filmed at work for a public promo video.
There are legal issues with it too, or at least they think there are. They take down developer presentations after a few years partly so they won't have videos of random (ex-)employees up forever.
I interviewed there in 2017 and honestly even back then the interior of their campus was kind of creepy in some places. The conference rooms had this flat, bland beige that reminded me of exactly the kind of computers the G3 era was trying to get away from, but the size of a room, and you were inside it.
I used to think the videos with all of the drone fly-bys were cool. But in the last year or so, I've started to feel the same as you. Where are all the people? It's starting to look like Apple spent a billion dollars building a technology ghost town.
Surely the entire staff can't be out rock climbing, surfing, eating at trendy Asian-inspired restaurants at twilight, and having catered children's birthday parties in immaculately manicured parks.
Oh I think they're very well done and very pretty! But lately this discomfort has started to creep in, as you note. Like something you'd see in a WALL-E spinoff: everyone has left the planet already but Buy n Large is still putting out these glorious promo videos using stock footage. Or, like, post-AI apocalypse, all the humans are confined to storage bins, but the proto-AI marketing programs are still churning out content.
> MacBook Pro with M4 Pro is up to 3x faster than M1 Pro (13)
> (13) Testing conducted by Apple from August to October 2024 using preproduction 16-inch MacBook Pro systems with Apple M4 Pro, 14-core CPU, 20-core GPU, 48GB of RAM and 4TB SSD, and production 16-inch MacBook Pro systems with Apple M1 Pro, 10-core CPU, 16-core GPU, 32GB of RAM and 8TB SSD. Prerelease Redshift v2025.0.0 tested using a 29.2MB scene utilising hardware-accelerated ray tracing on systems with M4 Pro. Performance tests are conducted using specific computer systems and reflect the approximate performance of MacBook Pro.
So they're comparing software that uses raytracing present in the M3 and M4, but not in the M1. This is really misleading. The true performance increase for most workloads is likely to be around 15% over the M3. We'll have to wait for benchmarks from other websites to get a true picture of the differences.
Edit: If you click on the "go deeper on M4 chips", you'll get some comparisons that are less inflated, for example, code compilation on pro:
14-inch MacBook Pro with M4 4.5x
14-inch MacBook Pro with M3 3.8x
13-inch MacBook Pro with M1 2.7x
So here the M4 Pro is 67% faster than the M1 Pro, and 18% faster than the M3 Pro. It varies by workload of course.
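For anyone wondering where those percentages come from: Apple's footnote figures are all multiples of a common older baseline, so dividing them compares the chips directly and the baseline cancels out.

    # Apple's "x times faster" figures share a baseline, so ratios compare chips.
    m4_pro, m3_pro, m1_pro = 4.5, 3.8, 2.7   # multiples quoted above

    print(f"M4 Pro vs M1 Pro: {m4_pro / m1_pro:.2f}x")   # ~1.67x -> "67% faster"
    print(f"M4 Pro vs M3 Pro: {m4_pro / m3_pro:.2f}x")   # ~1.18x -> "18% faster"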
I'm pleased that the Pro's base memory starts at 16 GB, but surprised they top out at 32 GB:
> ...the new MacBook Pro starts with 16GB of faster unified memory with support for up to 32GB, along with 120GB/s of memory bandwidth...
I haven't been an Apple user since 2012 when I graduated from college and retired my first computer, a mid-2007 Core2 Duo Macbook Pro, which I'd upgraded with a 2.5" SSD and 6GB of RAM with DDR2 SODIMMs. I switched to Dell Precision and Lenovo P-series workstations with user-upgradeable storage and memory... but I've got 64GB of RAM in the old 2019 Thinkpad P53 I'm using right now. A unified memory space is neat, but is it worth sacrificing that much space? I typically have a VM or two running, and in the host OS and VMs, today's software is hungry for RAM and it's typically cheap and upgradeable outside of the Apple ecosystem.
It seems you need the M4 Max with the 40-core GPU to go over 36GB.
The M4 Pro with 14‑core CPU & 20‑core GPU can do 48GB.
If you're looking for roughly 36-48GB of memory, here are the options:
$2,800 = 48GB, Apple M4 Pro chip with 14‑core CPU, 20‑core GPU
$3,200 = 36GB, Apple M4 Max chip with 14‑core CPU, 32‑core GPU
$3,600 = 48GB, Apple M4 Max chip with 16‑core CPU, 40‑core GPU
So the M4 Pro could get you a lot of memory, but fewer GPU cores. Not sure how much those GPU cores factor into performance; I only really hear complaints about the memory limits... Something to consider if looking to buy in this range of memory.
Of course, a lot of people here probably consider it not a big deal to throw an extra 3 grand at hardware, but I'm a hobbyist in academia when it comes to AI; I don't make big 6-figure salaries :-)
Somehow I got downvoted for pointing this out, but it's weird that you have to spend an extra $800 USD just to be able to surpass 48GB, and "upgrading" to the base-level Max chip decreases your RAM limit, especially when the M4 Pro in the Mac Mini goes up to 64GB. Like... that's a shitload of cash to put out if you need more RAM but don't care for more cores. I was really hoping to finally upgrade to something with 64GB, or maybe 96 or 128 if it decreased in price, but they removed the 96 and kept 64 and 128 severely out of reach.
Do I get 2 extra CPU cores, build a budget gaming PC, or subscribe to creative suite for 2.5 years!?
I haven't done measurements on this, but my Macbook Pro feels much faster at swapping than any Linux or Windows device I've used. I've never used an M.2 SSD so maybe that would be comparable, but swapping is pretty much seamless. There's also some kind of memory compression going on according to Activity Monitor, not sure if that's normal on other OSes.
Yes, other M.2 SSDs have comparable performance when swapping, and other operating systems compress memory, too — though I believe not as much as MacOS.
Although machines with Apple Silicon swap flawlessly, I worry about degrading the SSD, which is non-replaceable. So ultimately I pay for more RAM and don't need swapping at all.
Degrading the SSD is a good point. This is thankfully a work laptop so I don't care if it lives or dies, but it's something I'll have to consider when I eventually get my own Mac.
It looks like there are different versions of the 'Pro' based on core count and memory bandwidth. I'm assuming the 12-core Mini M4 Pro has the same memory bandwidth/channels enabled as the 14-core MBP M4 Pro, enabling the 64GB. My guess would be that it's related to binning and/or TDP.
Can anyone comment on the viability of using an external SSD rather than upgrading storage? Specifically for data analysis (e.g. storing/analysing parquet files using Python/duckdb, or video editing using DaVinci Resolve).
Also, any recommendations for suitable ssds, ideally not too expensive? Thank you!
Don't bother with Thunderbolt 4; go for a USB4 enclosure instead - I've got a Jeyi one. Any SSD will work; I use a Samsung 990 Pro inside. It was supposed to be the fastest you can get - I get over 3000MB/s.
With a TB4 case with an NVME you can get something like 2300MB/s read speeds. You can also use a USB4 case which will give you over 3000MB/s (this is what I'm doing for storing video footage for Resolve).
With a TB5 case you can go to like 6000MB/s. See this SSD by OWC:
I'm a little sus of OWC these days: their drives are way expensive, never get any third-party reviews or testing, and their warranty is horrible (3 years). I've previously sworn by them, so it's a little disappointing.
Hopefully in the next days/weeks we’ll see TB5 external enclosures and you’ll be able to hit very fast speeds with the new Macs. I would wait for those before getting another enclosure now.
AFAIK the main OEM producer is Winstars, though I could only find sketchy-looking AliExpress sellers so far.
Basically any good SSD manufacturer is fine, but I've found that the enclosure controller support is flaky with Sonoma. Drives that appear instantly in Linux sometimes take ages to enumerate in OSX, and only since upgrading to Sonoma. Stick with APFS if you're only using it for Mac stuff.
I have 2-4TB drives from Samsung, WD and Kingston. All work fine and are ridiculously fast. My favourite enclosure is from DockCase for the diagnostic screen.
The USB-C ports should be quite enough for that. If you are using a desktop Mac, such as an iMac, Mini, or the Studio and Pro that will be released later this week, this is a no-brainer - everything works perfectly.
I go with the Acasis Thunderbolt enclosure and then pop in an NVMe of your choice, but generic USB drives are pretty viable too... Thunderbolt can be booted from, while USB can't.
I tried another brand or two of enclosures and they were HUGE, while the Acasis was credit-card sized (except for thickness).
I've used a Samsung T5 SSD as my CacheClip location in Resolve and it works decently well! Resolve doesn't always tolerate disconnects very well, but when it's plugged in things are very smooth.
With a thunderbolt SSD you'll think your external drive is an internal drive. I bought one of these (https://www.amazon.com/gp/product/B0BGYMHS8Y) for my partner so she has snappy photo editing workflows with Adobe CC apps. Copying her 1TB photo library over took under 5 min.
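That timing lines up with the interface speeds discussed above. A quick check, treating the per-enclosure figures quoted earlier as sustained rates (real copies with lots of small files will come in slower):

    # Time to move 1 TB at the sustained rates mentioned in this thread.
    for label, gb_s in [("TB4 enclosure", 2.3), ("USB4 enclosure", 3.0), ("TB5 enclosure", 6.0)]:
        minutes = 1000 / gb_s / 60
        print(f"{label}: ~{minutes:.1f} min per TB")
    # ~7.2, ~5.6 and ~2.8 minutes; a 1TB library in about 5 minutes implies
    # roughly 3.3 GB/s sustained, i.e. a fast Thunderbolt/USB4 link working well.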
I had a big problem with Crucial 4TB SSDs recently, using them as Time Machine drives. The first backup would succeed, the second would fail, and the disk would then be unrepairable in Disk Utility, which would also refuse to format it to non-APFS (and an APFS reformat wouldn't fix it).
I edit all my video content from a USB-attached SSD with Resolve on my MBP.
My only complaint is that Apple gouges you for memory and storage upgrades. (But in reality I don't want the raw and rendered video taking up space on my machine).
> All MacBook Pro models feature an HDMI port that supports up to 8K resolution, a SDXC card slot, a MagSafe 3 port for charging, and a headphone jack, along with support for Wi-Fi 6E and Bluetooth 5.3.
No Wifi 7. So you get access to the 6 GHz band, but not some of the other features (preamble puncturing, OFDMA):
The iPhone 16s do have Wifi 7. Curious to know why they skipped it (and I wonder if the chipsets perhaps do support it, but it's a firmware/software-not-yet-ready thing).
I was quite surprised by this discrepancy as well (my new iPhone has 7, but the new MBP does not).
I had just assumed that for sure this would be the year I upgrade my M1 Max MBP to an M4 Max. I will not be doing so knowing that it lacks WiFi 7. As one of the child comments notes, I count on getting a solid 3 years out of my machine, so future-proofing carries some value (and I already have WiFi 7 access points). I download terabytes of data in some weeks for the work I do, and not having to plug into Ethernet at a fixed desk to do that efficiently will be a big enough win that I will wait another year before shelling out $6k "off-cycle".
Big bummer for me. I was looking forward to performance gains next Friday.
Laptops/desktops (with 16GB+ of memory) could make use of the faster speed/more bandwidth aspects of WiFi7 better than smartphones (with 8GB of memory).
> It looks like few people only are using Wifi 7 for now.
Machines can last and be used for years, and it would be a presumably very simple way to 'future proof' things.
And though the IEEE spec hasn't officially been ratified as I type this, it is set to be by the end of 2024. Network vendors are also shipping APs with the functionality, so in coming years we'll see a larger and larger infrastructure footprint going forward.
Hm why? Is 6E really so much worse than 7 in practice that 7 can replace wired for you but 6E can't? That's honestly really weird to me. What's the practical difference in latency, bandwidth or reliability you've experienced between 6E and 7?
I don’t have any 6E device so I cannot really tell for sure but from what I read, 6E gets you to a bit over 1Gbit in real world scenario. 7 should be able to replace my 2.5Gbe dongle or at least get much closer to it. I already have routers WiFi 7 Eeros on a 2.5Gbe wired backbone.
I guess it makes sense if what you do is extremely throughput-focused... I always saw consistency/reliability and latency as the benefits of wired compared to wireless, the actual average throughput has felt fast enough for a while on WiFi but I guess other people may have different needs
Yeah, this threw me as well. When the iMac didn't support WiFi 7, I got a bit worried. I have an M2, so I'm not going to get this, but the spouse needs a new Air and I figured everything would have WiFi 7 by then; now I don't think so.
Faster is always nice, makes sense. But do you really need WiFi 7 features/speed? I don't know when I would notice a difference (on a laptop) between 600 or 1500 Mbit/s (just as an example). Can't download much anyhow as the storage will get full in minutes.
This is the first compelling Mac to me. I've used Macs for a few clients and muscle memory is very deeply ingrained for linux desktops. But with local LLMs finally on the verge of usability along with sufficient memory... I might need to make the jump!
Wish I could spin up a Linux OS on the hardware though. Not a bright spot for me.
It won't have all the niceties / hardware support of MacOS, but it seamlessly coexists with MacOS, can handle the GPU/CPU/RAM with no issues, and can provide you a good GNU/Linux environment.
IIRC one of the major factors holding back M3 support was the lack of a M3 mini for use in their CI environment. Now that there's an M4 mini hopefully there aren't any obstacles to them adding M4 support
How? What cloud providers offer it? MacStadium and AWS don't.
I guess you could have a physical MBP in your house and connect it to some bring-your-own-infrastructure CI setup, but most people wouldn't want to do that.
How do you imagine that a cloud computing platform designed around running Macs with macOS would work for testing an entirely different OS running on bare metal on hardware that doesn't have a BMC, and usefully catching and logging frequent kernel panics and failed boots?
It's a pretty hard problem to partially automate for setups with an engineer in the room. It doesn't sound at all feasible for an unattended data center setup that's designed to host Xcode for compiling apps under macOS.
I miss Linux, it respected me in ways that MacOS doesn't. But maintaining a sane dev environment on linux when my co-workers on MacOS are committing bash scripts that call brew... I am glad that I gave up that fight. And yeah, the hardware sure is nice.
IIRC brew supports linux, but it isn't a package manager I pay attention to outside of some very basic needs. Way too much supply chain security domain to cover for it!
It does, but I prefer to keep project dependencies bound to that project rather than installing them at wider scope. So I guess it's not that I can't use Linux for work, but that I can't use Linux for work and have it my way. And if I can't have it my way anyway, then I guess Apple's way will suffice.
In general for local LLMs, the more memory the better. You will be able to fit larger models in RAM. The faster CPU will give you more tokens/second, but if you are just chatting with a human in the loop, most recent M series macs will be able to generate tokens faster than you can read them.
That also very much depends on model size. For 70B+ models, while the tok/s are still fast enough for realtime chat, it's not going to be generating faster than you can read it, even on Ultra with its insane memory bandwidth.
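A common back-of-envelope for why the big models sit right at reading pace: single-stream decoding is memory-bandwidth-bound, so tokens/sec is roughly bandwidth divided by the bytes read per token (about the size of the quantised weights). The bandwidth and model-size numbers below are illustrative assumptions, and real throughput lands below this ceiling.

    # Rough upper bound on decode speed for memory-bound generation.
    def approx_tok_per_s(bandwidth_gb_s: float, model_gb: float) -> float:
        # Each generated token streams (roughly) the whole quantised model once.
        return bandwidth_gb_s / model_gb

    print(approx_tok_per_s(400, 40))   # ~10 tok/s: Max-class bandwidth, 70B 4-bit quant
    print(approx_tok_per_s(800, 40))   # ~20 tok/s: Ultra-class bandwidth
    # Comfortable reading is ~4-7 words/s, i.e. roughly 5-9 tokens/s, so a 70B
    # model hovers around read-along pace once real-world losses are factored in.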
NeXTSTEP, which macOS is ultimately based on, is indeed older than Linux (the first release was 1989). But why does that matter? The commenter presumably said "Linux" for a reason, i.e. they want to use Linux specifically, not any UNIX-like OS.
Sure. But not everybody. That’s how I ended up on a Mac. I needed to develop for Linux servers and that just sucked on my windows laptop (I hear it’s better now?). So after dual booting fedora on my laptop for several months I got a MacBook and I’ve never looked back.
BSD is fun (not counting MacOS in the set there), but no, my Unix experiences have universally been on legacy hardware, oversubscribed and undermaintained. Not my favorite place to spend any time.
It seems they also update the base memory on MacBook Air:
> MacBook Air: The World’s Most Popular Laptop Now Starts at 16GB
> MacBook Air is the world’s most popular laptop, and with Apple Intelligence, it’s even better. Now, models with M2 and M3 double the starting memory to 16GB, while keeping the starting price at just $999 — a terrific value for the world’s best-selling laptop.
Wow, I didn't expect them to update the older models to start at 16GB and no price increase. I guess that is why Amazon was blowing the 8GB models out at crazy low prices over the past few days.
Costco was selling MB Air M2 8 GB for $699! Incredible deal.
I’ve been using the exact model for about a year and I rarely find limitations for my typical office type work. The only time I’ve managed to thermally throttle it has been with some super suboptimal Excel Macros.
They did? The announcement tweet from the head of marketing did not mention three days.
That said, I believe you. Some press gets a hands-on on Wednesday (today) so unless they plan to pre-announce something (unlikely) or announce software only stuff, I think today is it.
"This is a huge week for the Mac, and this morning, we begin a series of three exciting new product announcements that will take place over the coming days," said Apple's hardware engineering chief John Ternus, in a video announcing the new iMac.
That's disappointing. I was expecting a new Apple TV because mine needs replacement and I don't really feel inclined to get one that's due for an upgrade very soon.
The current-gen Apple TV is already overpowered for what it does, and extremely nice to use. I can think of very few changes I would like to see, and most of them are purely software.
Mine has 128GB of onboard storage... but Apple still bans apps from downloading video, which annoys me.
The streaming apps virtually all support downloading for offline viewing on iPhone, but the Apple TV just becomes a paperweight when the internet goes out, because I'm not allowed to use the 128GB of storage for anything.
If they're not going to let you use the onboard storage, then it seems unlikely for them to let you use USB storage. So, first, I would like them to change their app policies regarding internal storage, which is one of the purely software improvements I would like to see.
I use a dedicated NAS as a Plex server + Plex app on Apple TV itself for local streaming, which generally works fine. Infuse app can also index and stream from local sources.
But there are some cases like e.g. watching high-res high-FPS fractal zoom videos (e.g. https://www.youtube.com/watch?v=8cgp2WNNKmQ) where even brief random skipped frames from other things trying to use WiFi at the same time can be really noticeable and annoying.
It would make more sense to discontinue the smaller model along with some other updates to the line. Or in other words, the Air won't receive any other updates this week, unfortunately.
See that’s the thing. Given that somehow you need 1TB to get the matte screen, I feel like Apple is using it as a way to upsell. It would indicate that perhaps Apple won’t offer a matte MacBook Air.
It'll be interesting to see the reaction of tech commentators about this. So many people have been screaming at Apple to increase the base RAM and stop price gouging their customers on memory upgrades. If Apple Intelligence is the excuse the hardware team needed to get the bean counters on board, I'm not going to look a gift horse in the mouth!
It wouldn't surprise me if people typically use more storage on their phone than their computer. The phone should probably have a higher base storage than the base storage of their laptops.
Extremely interesting point. My initial reaction to your comment is that it is a crazy thing to say, but the more I think about it the more I agree with you. On my phone is where I have tons of 4k 30/60FPS videos, high resolution photos (with live), and videos downloaded on Netflix and YouTube.
On my Mac I don't have any of these things, it's mostly for programming and some packages. I'm almost always connected to Wi-Fi (except on planes) so I don't really need any photos or videos.
The only people that I see have high storage requirements on Macs are probably video/media creators? As a programmer I'm totally fine with 512GB, but could probably live with 256GB if I wanted to be super lean.
Thanks to your comment, I persuaded my friend who recently purchased an M3 Air 24GB to ask for a refund, and we got $200 back (where we live, price-drop compensation is valid for 14 days after the date of delivery).
The only older configs that Apple sells are the M2 and M3 Airs, which were bumped. Everything else is now on M4, or didn't have an 8GB base config (Mac Studio, Mac Pro).
Ohh, good catch. Sneaking that into the MBP announcement. I skimmed the page and missed that. So a fourth announcement couched within the biggest of the three days.
I've seen a lot of people complaining about 8GB but honestly my min spec M1 Air has continued to be great. I wouldn't hesitate to recommend a refurb M1 8GB Air for anyone price conscious.
Yeah, this one caught me off guard. We just purchased a MacBook Air in the last month and if we bought the same one now, we would save $200. Apple support would not price match/correct that, so we will be returning it and purchasing anew.
I think spec-wise the Air is good enough for almost everyone who isn't doing video production or running local LLMs, I just wish it had the much nicer screen that the Pro has. But I suppose they have to segregate the product lines somehow.
Well, the issue for me with memory on these new models is that the Max ships with 36GB and no expandable memory option. Getting more memory is gated behind a $300 CPU upgrade (plus the memory cost).
My one concern is that nano-texture Apple displays are a little more sensitive to damage, and even being super careful with my MBPs I get little marks from the keyboard when carrying the laptop with my hand squeezing the lid and bottom (a natural carry motion).
Love the nano-texture on the Studio Display, but my MacBooks have always suffered from finger oil rubbing the screen from the keys. Fingerprint oil on nano-texture sounds like a recipe for disaster.
For my current laptop, I finally broke down and bought a tempered glass screen protector. It adds a bit of glare, but wipes clean — and for the first time I have a one-year-old MacBook that still looks as good as new.
I put a thin screen cleaner/glasses cleaner cloth on the keyboard whenever I close the lid. That keeps the oils off the screen as well as prevents any pressure or rubbing from damaging the glass.
If your goal is to sell more MBPs (and this is marketing presentation) then, judging by the number of comments that have the phrase "my M1" and the top comment, it seems like M1 vs M4 is the right comparison to make. Too many people are sticking with their M1 machines. Including me.
It's actually interesting to think about. Is there a speed multiplier that would get me off this machine? I'm not sure there is. For my use case the machine's performance is not my productivity bottleneck. HN, on the other hand... That one needs to be attenuated. :)
It does and it gets even worse when you realize those stats are only true under very specific circumstances, not typical computer usage. If you benchmarked based on typical computer usage, I think you'd only see gains of 5% or less.
Anyone know of articles that deep-dive into the "snappiness" or "feel" of computer experiences?
Everyone knows SSDs made a big difference in user experience. For the CPU, normally if you aren't gaming at high settings or "crunching" something (compiling or processing video etc.) then it's not obvious why CPU upgrades should be making much difference even vs. years-old Intel chips, in terms of that feel.
There is the issue of running heavy JS sites in browsers but I can avoid those.
The main issue seems to be how the OS itself is optimized for snappiness, and how well it's caching/preloading things. I've noticed Windows 10 file system caching seems to be not very sophisticated for example... it goes to disk too often for things I've accessed recently-but-not-immediately-prior.
Similarly, when it comes to generating heat: if laptops are getting hot even while doing undemanding office tasks with huge periods of idle time, then basically it points to stupid software -- or let's say poorly balanced software (likely aimed more at benchmark numbers than at user experience).
So far I’m only reading comments here from people wowed by things it seems the M3 pretty much also had. Not seeing anything new besides “a little bit better specs”.
The M4 is architecturally better than the M3, especially on GPU features IIRC, but you’re right it’s not a total blow out.
Not all products got the M3, so in some lines this week is the first update in quite a while. In others like MBP it’s just the yearly bump. A good performing one, but the yearly bump.
I have to admit, 4 generations in, 1.8x is decent but slightly disappointing all the same.
I'd really like to justify upgrading, but a $4k+ spend needs to hit greater than 2x for me to feel it's justified. 1.8x is still "kind of the same" as what I have already.
To run LLMs locally (Ollama/LLM Notebook), you want as much memory as you can afford. For actually training toy models yourself for learning/experiments, in my experience it doesn't matter much. PyTorch is flexible.
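To make "toy models" concrete, here's a minimal sketch of the kind of thing I mean; it assumes PyTorch is installed on Apple silicon and uses the MPS (Metal) backend when available, falling back to CPU. The model and data are made up for illustration:

  import torch
  import torch.nn as nn

  # Prefer Apple's Metal backend (MPS) on M-series Macs, otherwise use the CPU.
  device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

  # A tiny synthetic regression task: learn y = 3x + 1 from noisy samples.
  x = torch.rand(1024, 1, device=device)
  y = 3 * x + 1 + 0.05 * torch.randn_like(x)

  model = nn.Sequential(nn.Linear(1, 32), nn.ReLU(), nn.Linear(32, 1)).to(device)
  opt = torch.optim.Adam(model.parameters(), lr=1e-2)
  loss_fn = nn.MSELoss()

  for step in range(500):
      opt.zero_grad()
      loss = loss_fn(model(x), y)
      loss.backward()
      opt.step()

  print(f"device: {device}, final loss: {loss.item():.4f}")

Nothing here comes close to stressing RAM, which is the point: for learning-scale experiments the chip generation barely matters.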
I’m really excited about the nano-texture display option.
It’s essentially a matte coating, but the execution on iPad displays is excellent. While it doesn’t match the e-ink experience of devices like the Kindle or reMarkable, it’s about 20-30% easier on the eyes. The texture also feels great (even though that’s less relevant for a laptop), and the glare reduction is a welcome feature.
I prefer working on the MacBook screen, but I nearly bought an Apple Studio Display or Pro Display XDR, or an iPad, as a secondary screen just for that nano-texture finish. It's super good news that this is coming to the MacBook Pro.
Do you actually have to wipe the screen with the included special cloth? The screens on all of the MacBooks I've had usually get oily patches from contact with the keycaps, so I have to wipe them regularly.
I have Pro Display XDR with nano coating and the manual definitely says to only use their special cleaning cloth (or with some isopropyl alcohol). The standard coating might not need it though.
How is the contrast? The HDR content? Any downsides?
I will upgrade to the M4 Pro. I really hate the glare when I travel (and I do that a lot), but at the same time I don't want to lose any of the quality that the MBP delivers, which is quite excellent IMHO.
I love mine; it has a fresh OEM battery as well, and runs the latest OS with OpenCore Legacy. But it's starting to get a bit annoying. Usable, but it is starting to feel slow, and the fan kicks up frequently.
I might still keep it another year or so, which is a testament to how good it is and how relatively little progress has happened in almost 10 years.
If I still had my 2015 I would have applied some liquid metal TIM by now, I did a paste refresh and that worked very well to get the fan under control.
If it's got a full function row, it will probably work just fine under Linux. My 2014 MBP chugged pretty hard with OpenCore but handles modern Linux distros much better.
Which MacOS version? I upgraded to a newer one and it crawled to a halt, it's unusable now. UI is insanely laggy. It's sitting in a drawer gathering dust now
I have a 16" M1 Pro with 16 gigs of ram, and it regularly struggles under the "load" of Firebase emulator.
You can tell not because the system temp rises, but because suddenly Spotify audio begins to pop, constantly and irregularly.
It took me a year to figure out that the system audio popping wasn't hardware and indeed wasn't software, except in the sense that memory (or CPU?) pressure seems to be the culprit.
This kind of sounds like someone is abusing perf cores and high-priority threading in your stack. IIRC, on MacOS audio workgroup threads are supposed to be scheduled with the highest (real-time) priority on P cores, which shouldn't have issues under load unless something else is trying to compete at the same priority.
There is some discussion online on whether this happens when you have a Rosetta app running in the background somewhere (say a util you got via Homebrew, for example).
Even when I remove all "Intel" type apps in activity monitor, I still experience the issue though.
This happens whenever I load up one of our PyTorch models on my M1 MBP 16GB too. I also hate the part where, if the model (or any other set of programs) uses too much RAM, the whole system will sometimes straight up hang and then crash due to a kernel watchdog timeout instead of just killing the offender.
I’ve had something similar happen as a bug when I was using the Python sounddevice library and calling numpy functions inside its stream callback. Took me a long time to figure out that numpy subroutines that drop the GIL would cause the audio stream to stall.
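For anyone who hits the same thing, a stripped-down sketch of the shape of that bug, assuming the sounddevice library (the FFT size and callback body here are made up to illustrate the anti-pattern, not my actual code):

  import numpy as np
  import sounddevice as sd

  SAMPLE_RATE = 48000
  phase = 0

  def callback(outdata, frames, time, status):
      global phase
      if status:
          print(status)  # underflows get reported here when the callback runs late
      # Fill the output buffer with a 440 Hz sine wave.
      t = (np.arange(frames) + phase) / SAMPLE_RATE
      outdata[:, 0] = 0.2 * np.sin(2 * np.pi * 440 * t)
      phase += frames
      # Anti-pattern: heavy NumPy work inside the realtime audio callback.
      # Big routines like this can block past the buffer deadline (and they
      # release the GIL, so other Python threads pile in too); do this work
      # on a separate thread and hand results to the callback via a queue.
      np.fft.rfft(np.random.randn(1 << 18))

  with sd.OutputStream(samplerate=SAMPLE_RATE, channels=1, callback=callback):
      sd.sleep(5000)  # play (and probably glitch) for five seconds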
Whoa! I've been so annoyed by this for years, so interesting that you figured it out. It's the kind of inelegance in design that would have had Steve Jobs yelling at everyone to fix; it just ruins immersion in music and had no obvious way to fix.
That sounds like an app issue, it might be doing non-realtime-safe operations on a realtime thread. But generally speaking, if you have an issue, use feedback assistant.
wot, m8? Only Apple will call a 12 megapixel camera “advanced”. Same MPs as an old iPhone 6 rear camera.
Aside from that, it’s pretty much the same as the prior generation. Same thickness in form factor. Slightly better SoC. Only worth it if you jump from M1 (or any Intel mbp) to M4.
Would be godlike if Apple could make the chip swappable. Buy a Mac Studio M2 Ultra Max Plus. Then just upgrade SoC on an as needed basis.
Would probably meet their carbon neutral/negative goals much faster. Reduce e-waste. Unfortunately this is an American company and it's got to turn a profit. Profit over environment and consumer interests.
You’re comparing cameras against different product segments.
Laptop cameras are significantly smaller in all dimensions than phone cameras. Most laptop cameras are 1-4MP: the majority are 720p (about 1MP), and a few are 1080p (about 2MP). The previous MacBook Pro camera was 1080p.
For reference, a 4K image is roughly 8MP.
12MP is absolutely a massive resolution bump, and I’d challenge you to find a competitive alternative in a laptop.
I feel like if they pushed Win32/gaming on Apple Mx hardware it'd give at least one real reason for people to adopt or upgrade their devices to new models. I know for sure I'd be on board if everything that runs on my Steam Deck ran on a Mac, game-wise, since that's what's holding me back from dropping the cash. I still think I'll get a mini though.
Valve is trying to obsolete Windows, so they can prevent Microsoft from interfering with Steam. Apple could team up with them, and help obsolete Windows for a very large percentage of game-hours.
There will always be a long tail of niche Windows games (retro + indie especially). But you can capture the Fortnite (evergreen) / Dragon Age (new AAA) audience.
My only explanations for the lack of gaming support (see historical lack of proper OpenGL support) while still supporting high end graphics use cases (film editing, CAD, visual effects) are:
1) Either Apple wants to maintain the image of the Macbook as a "serious device", and not associate itself with the likes of "WoW players in their mom's basement".
2) Microsoft worked something out with Apple, where Apple would not step significantly on the gaming market (Windows, Xbox). I can't think of another reason why gaming on iOS would be just fine, but abysmal on MacOS. Developers release games on MacOS _despite_ the platform.
Steve Jobs was historically against gaming on Apple devices and, I believe, went so far as to try to remove games from the Apple Store. Apple has only recently started to seriously reintroduce gaming to the platform.
Would be incredibly fascinating to consider what if Bungie had never been bought by Microsoft and Halo had ended up a Mac title first. It would've severely capped the influence of the game (and maybe its quality), even after it would have been ported to PC. Would Halo have even been ported to Xbox? On the flip side, if it had somehow managed to capture considerable success, would it have forced Jobs and Apple to recognize the importance of the gaming market? Either way, the entire history of video games would be altered.
More megapixels on a tiny sensor does not make it more advanced. At a certain point it only makes it worse. That doesn't tell you anything about the quality of the image. There is way more to digital cameras than pixel count.
Especially because pixel count is a meaningless metric by itself. 12MP is the same as a Nikon D3, which if it could replicate the results of I would be happy with!
Megapixels is nothing more than the number of sample points. There's so much more to image quality than the number of samples.
I blame the confusion on PC and Android marketing people, who pushed for years and years the idea that the higher the megapixel count, the better the camera. Non-Apple customers should be really pissed off about the years of misinformation and indoctrination around a false KPI.
The marketing gimmicks pushed generations of devices to optimize for meaningless numbers. At times, even Apple was forced to adopt those. Such a shame.
That's annoying. I really want to fully remove lightning connectors from my life, but, my existing magic* devices work fine and will probably work fine for another decade or two.
It pains me deeply that they used Autodesk Fusion in one of the app screenshots. It is by far the worst piece of software I use on Mac OS.
Wish the nano-texture display was available when I upgraded last year. The last MacBook I personally bought was in 2012 when the first retina MBP had just released. I opted for the "thick" 15" high-res matte option. Those were the days...
I don’t think it will “feel” much faster like the Intel -> M1 where overall system latency especially around swap & memory pressure got much much better.
If you do any amount of 100% CPU work that blocks your workflow, like waiting for a compiler or typechecker, I think M1 -> M4 is going to be worth it. A few of my peers at the office went M1->M3 and like the faster compile times.
Like, a 20 minute build on M1 becoming a 10 minute build on M4, or a 2 minute build on M1 becoming a 1 minute build on M4, is nothing to scoff at.
I have a MBP M1 16GB at home, and a MBP M3 128GB at work. They feel the same: very fast. When I benchmark things I can see the difference (or when fiddling with larger LLM models), other than that, the M1 is still great and feels faster and more enjoyable than any Windows machine I interact with.
I guess it’s only worth it for people who would really benefit from the speed bump — those who push their machines to the limit and work under tight schedules.
I myself don’t need so much performance, so I tend to keep my devices for many, many years.
I do a lot of (high-end mirrorless camera, ~45MP, 14 bits/pixel raw files) photo processing. There are many individual steps in Photoshop, Lightroom, or various plug-ins that take ~10 seconds on my M1 Max MBP. It definitely doesn't feel fast. I'm planning to upgrade to one of these.
No support for M3 or M4 powered machines currently.
> All Apple Silicon Macs are in scope, as well as future generations as development time permits. We currently have support for most machines of the M1 and M2 generations.[^1][^2]
btw, there is a recent interview with an Asahi dev focusing on GPUs, worth a listen for those interested in Linux on Apple silicon. The reverse engineering effort required to pin down the GPU hardware was one of the main topics.
For many years I treated Windows or macOS as a hypervisor - if you love Linux but want the Mac hardware, instant sleep & wake, etc, putting a full screen VM in Parallels or similar is imo better than running Linux in terms of productivity, although it falls short on “freedom”.
I do the same thing, but there are two big caveats:
1. Nested virtualization doesn't work in most virtualization software, so if your workflow involves running stuff in VMs it is not going to work from within another VM. The exception is apparently now the beta version of UTM with the Apple Virtualization backend, but that's highly experimental.
2. Trackpad scrolling is emulated as discrete mouse wheel clicks, which is really annoying for anyone used to the smooth scrolling on macOS. So what I do is use macOS for most browsing and other non-technical stuff but do all my coding in the Linux VM.
$ swift repl
Welcome to Apple Swift version 6.0.2 (swiftlang-6.0.2.1.2 clang-1600.0.26.4).
Type :help for assistance.
1> import Virtualization
2> VZGenericPlatformConfiguration.isNestedVirtualizationSupported
$R0: Bool = false
Has anyone tried it recently, specifically the trackpad? I tried the Fedora variant a few months ago on my M1 Macbook and the trackpad was horrible to use; it felt totally foreign and wrong.
I feel you, but Apple's trackpad prowess is not an easy thing to copy. It's one of those things where I never expect anyone else to be able to replicate the deep integration between the hardware and the software.
It's 2024, and I still see most Windows users carrying a mouse to use with their laptop.
Insane cost for the amount of storage and RAM. I mean, year over year for Apple, awesome! Compared to the rest of the brands... so ridiculously expensive. Watching the price climb to 5K as you add in the new normal for hardware specs is absurd.
Nice to see they increased the number of performance cores in the M4 Pro, compared to the M3 Pro. Though I am worried about the impact of this change on battery life on the MBPs.
Another positive development was bumping up baseline amounts of RAM. They kept selling machines with just 8 gigabytes of RAM for way longer than they should have. It might be fine for many workflows, but feels weird on “pro” machines at their price points.
I’m sure Apple has been coerced to up its game because of AI. Yet we can rejoice in seeing their laptop hardware, which already surpassed the competition, become even better.
I'm curious why they decided to go this route, but glad to see it. Perhaps ~4 efficiency cores is simply just enough for the average MBP user's standard compute?
In January, after researching, I bought an Apple refurbished MBP with an M2 Max over an M3 Pro/Max machine because of the performance/efficiency core ratio. I do a lot of music production in DAWs, and many, even Apple's Logic Pro, don't really make use of efficiency cores. I'm curious about what constraints have led to this, but perhaps it also factors into Apple's choice to increase the ratio of performance to efficiency cores.
> Perhaps ~4 efficiency cores is simply just enough for the average MBP user's standard compute?
I believe that’s the case. Most times, the performance cores on my M3 Pro laptop remain idle.
What I don’t understand is why battery life isn’t more like that of the MacBook Airs when not using the full power of the SOC. Maybe that’s the downside of having a better display.
> Curious how you're measuring this. Can you see it in Activity Monitor?
I use an open source app called Stats [1]. It provides a really good overview of the system on the menu bar, and it comes with many customization options.
Got the money, are in the consumerism camp: Switch to latest model every year because the camera island changed 5mm.
Got the professional need in games or video and your work isn't covering your device: Switch to new model every couple of generations.
Be me: I want to extend the lifecycle of things I use. Learn how to repair what you own (it's never been as easy), be aware of how you can work in today's world (who needs laptop RAM if I can spin up containers in the cloud) - I expect to not upgrade until a similarly stellar step up in the category of Intel to Apple Silicon comes along.
All past Mx versions being mostly compared to Intel baselines: Boring.
M4 1.8 times faster than M1 Pro: Nice, but no QoL change. For the few times I might need it, I can spin up a container in the cloud.
I update largely based on non performance criteria:
- new display tech
- better wireless connectivity
- updated protocols on ports (e.g., support for higher-res displays and newer DisplayPort/HDMI versions)
- better keyboard
- battery life
Once a few of those changes accumulate over 4+ generations of improvements that’s usually the time for me to upgrade.
My laptops so far: the first 2008 plastic MacBook, a 2012 MacBook Pro, a 2015 MacBook Pro, and an M1 Pro 16 currently. I skipped the 2016-2020 generation, which was a massive step backwards on my upgrade criteria, and updated to the 2015 model in 2016 once I realized Apple had lost their marbles and had no near-term plans at the time to make a usable laptop.
Also getting a maxed out configuration really helps the longevity.
On major remodels or with compelling features. I had an i9 MacBook Pro and then upgraded to an M1 MacBook Pro because it was a major leap forward. However, I will wait until the MacBook Pro is redesigned yet again (maybe thinner and lighter, as I travel a lot and carry-on weight is limited), apparently in 2026 or so with OLED and other features, rumors say.
Pick a daily cost you’re comfortable with. If you’re contracting at say $500/day, how much are you willing to spend on having a responsive machine? $10? $20?
Multiply it out: 220 work days a year * $10/day is $2200 a year towards your laptop.
Depends if it is a personal machine or paid by your company. 5+ years is what I generally expect from an apple laptop (been using them since around 2007-2009) if I own. For an M1-3 that could be a bit longer. If it is paid by your company, then whenever you have the budget :)
The 2014 model I bought in early 2015 still works, though the battery is dodgy. I did get the motherboard replaced in 2020 which was pricey, but much cheaper than a new machine.
Is there some reason your current computer isn't working for you? If not, why upgrade? Use it as long as you can do so practically & easily.
On the other extreme, I knew someone who bought a new MBP with maximum RAM specs each year. She'd sell the old one for a few hundred less than she paid, then she always had new hardware with applecare. It was basically like leasing a machine for $400/yr.
My previous MacBook was a Pro model from 2015; I waited 6 years to finally upgrade to an M1 Air because of the awful Touch Bar models they had in between (though I'm still using the 2015 Pro for personal stuff, in fact right now. It's upgraded to the latest macOS using OpenCore and it still runs great). But I would say upgrade every 3-5 years depending on how heavy a professional user you are.
Because it made the esc key useless for touch typists, and because, as a vi user, I hit esc approximately a bazillion times per day, I mapped caps lock to esc.
Now my fingers don't travel as far to hit esc.
I still use that mapping even on my regular keyboards and my current non-touch-bar macs.
People have different passions, I like computers. If I feel a new Mac is going to be fun for whatever reason, I consider upgrading it.
Performance wise they last a long time, so I could keep them way longer than I do, but I enjoy newer and more capable models.
You can always find someone to buy the older model. Macs have a great second hand market.
I'm using them for several years - I still have a Mac mini (from 2012) and an iMac Pro (from 2017) running. I also get a company Macbook which I can upgrade every three years.
But there is also another strategy: get a new Mac when they come out and sell it before/after the next model appears. There is a large market for used Macs. A friend of mine has been doing this for quite some time.
It's hard to imagine any reason why I would not want to keep upgrading to a new MBP every few years -- my M3 MBP is by far the best laptop I've owned thanks to the incredible battery life.
Of course I'm rooting for competition, but Apple seems to be establishing a bigger and bigger lead with each iteration.
I don’t see the yearly releases as saying you have to upgrade. Rather, having a consistent cadence makes it easier for the supply chain, and the short iteration time means there’s less pressure to rush something in half-baked or delay a release.
My M1 laptop from early 2022 is too good for me to care about upgrading right now, I loaded it up with 64GB ram and it's still blazing. What benefit would I really notice? My heavy apps loading a couple of seconds faster?
What’s amazing is that in the past I’ve felt the need to upgrade within a few years.
New video format or more demanding music software is released that slows the machine down, or battery life craters.
Well, I haven’t had even a tinge of feeling that I need to upgrade after getting my M1 Pro MBP. I can’t remember it ever skipping a beat running a serious Ableton project, or editing in Resolve.
Can stuff be faster? Technically of course. But this is the first machine that even after several years I’ve not caught myself once wishing that it was faster or had more RAM. Not once.
Perhaps it’s my age, or perhaps it’s just the architecture of these new Mac chips are just so damn good.
Laptops in general are just better than they used to be, with modern CPUs and NVMe disks. I feel exactly the same seeing new mobile AMD chips too, I'm pretty sure I'll be happy with my Ryzen 7040-based laptop for at least a few years.
Apple's M1 came at a really interesting point. Intel was still dominating the laptop game for Windows laptops, but generational improvements felt pretty lame. A whole lot of money for mediocre performance gains, high heat output and not very impressive battery. The laptop ecosystem changed rapidly as not only the Apple M1 arrived, but also AMD started to gain real prominence in the laptop market after hitting pretty big in the desktop and data center CPU market. (Addendum: and FWIW, Intel has also gotten a fair bit better at mobile too in the meantime. Their recent mobile chipsets have shown good efficiency improvements.)
If Qualcomm's Windows on ARM efforts live past the ARM lawsuit, I imagine a couple generations from now they could also have a fairly compelling product. In my eyes, there has never been a better time to buy a laptop.
(Obligatory: I do have an M2 laptop in my possession from work. The hardware is very nice, it beats the battery life on my AMD laptop even if the AMD laptop chews through some compute a bit faster. That said, I love the AMD laptop because it runs Linux really well. I've tried Asahi on an M1 Mac Mini, it is very cool but not something I'd consider daily driving soon.)
> Laptops in general are just better than they used to be, with modern CPUs and NVMe disks. I feel exactly the same seeing new mobile AMD chips too, I'm pretty sure I'll be happy with my Ryzen 7040-based laptop for at least a few years.
You say that, but I get extremely frustrated at how slow my Surface Pro 10 is (with an Ultra 7 165U).
It could be Windows of course, but this is a much more modern machine than my MacBook Air (M1), and at times it feels like it's almost 10 years old in comparison, despite being 3-4 years newer.
It's true that Linux may be a bit better in some cases, if you have a system that has good Linux support, but I think in most cases it should never make a very substantial difference. On some of the newer Intel laptops, there are still missing power management features anyways, so it's hard to compare.
That said, Intel still has yet to catch up to AMD on efficiency unfortunately, they've improved generationally but if you look at power efficiency benchmarks of Intel CPUs vs AMD you can see AMD comfortably owns the entire top of the chart. Also, as a many-time Microsoft Surface owner, I can also confirm that these devices are rarely good showcases for the chipsets inside of them: they tend to be constrained by both power and thermal limits. There are a lot of good laptops on the market, I wouldn't compare a MacBook, even a MacBook Air, a laptop, with a Surface Pro, a 2-in-1 device. Heck, even my Intel Surface Laptop 4, a device I kinda like, isn't the ideal showcase for its already mediocre 11th gen Intel processor...
The Mac laptop market is pretty easy: you buy the laptops they make, and you get what you get. On one hand, that means no need to worry about looking at reviews or comparisons, except to pick a model. They all perform reasonably well, the touchpad will always be good, the keyboard is alright. On the other hand, you really do get what you get: no touchscreens, no repairability, no booting directly into Windows, etc.
I changed the wording to be "booting directly" to clarify that I'm not including VMs. If I have to explain why that matters I guess I can, but I am pretty sure you know.
If the roles were reversed would you still need an explanation? e.g. If I could run macOS inside of a VM on Windows and run things like Final Cut and XCode with sufficient performance, would you think there's no benefit to being able to boot macOS natively?
Booting natively means you need real drivers, which don't exist for Windows on Mac as well as for macOS on PC. It'd be useless. Just use the VM, it's good.
And it's not the same - running Windows natively on Mac would seriously degrade the Mac, while running macOS on a PC has no reason to make it worse than with Windows. Why not buy a PC laptop at that point? The close hardware/OS integration is the whole point of the product. Putting Windows into a VM lets you use best of both.
The question was a hypothetical. What if the macOS VM was perfect? If it was perfect, would it then not matter if you couldn't just boot into macOS?
I'm pretty sure you would never use a Windows PC just to boot into a macOS VM, even if it was flawless. And there are people who would never boot a Mac just to boot into a Windows VM, even if it was flawless. And no, it's not flawless. Being able to run a relatively old strategy game is not a great demonstration of the ability to generally play any random Windows game. I have a Parallels and VMware Fusion license (well... had, anyway), and I'm a long-time (20 years) Linux user; I promise that I am not talking out my ass when it comes to knowing all about the compromises of interoperability software.
To be clear, I am not trying to tell you that the interoperability software is useless, or that it doesn't work just fine for you. I'm trying to say that in a world where the marketshare of Windows is around 70%, a lot of people depend on software and workflows that only work on Windows. A lot of people buy PCs specifically to play video games, possibly even as a job (creating videos/streaming/competing in esports teams/developing video games and related software) and they don't want additional input latency, lower performance, and worse compatibility.
Even the imperfections of virtual machines aside, some people just don't like macOS. I don't like macOS or Windows at all. I think they are both irritating to use in a way that I find hard to stomach. That doesn't mean that I don't acknowledge the existence of many people who very much rely on their macOS and Windows systems, the software ecosystems of their respective systems, and the workflows that they execute on those systems.
So basically, aside from the imperfections of a virtual machine, the ability to choose to run Windows as your native operating system is really important for the obvious case where it's the operating system you would prefer to run.
If we're mostly concerned about CPU grunt, it's really hard to question the Ryzen 7040, which, like the M1, is also not the newest-generation chip, though it is newer than the M1 by a couple of years. Still, it's worth comparing an M1 MacBook Pro with a Framework 16 on Geekbench.
Both of these CPUs perform well enough that most users will not need to be concerned at all about the compute power. Newer CPUs are doing better but it'd be hard to notice day-to-day.
As for other laptop features... That'll obviously be vendor-dependent. The biggest advantage of the PC market is all of the choices you get to make, and the biggest disadvantage of the PC market is all of the choices you have to make. (Edit: Though if anyone wants a comparison point, just for sake of argument, I think generally the strongest options have been from ASUS. Right now, the Zephyrus G16 has been reviewing pretty good, with people mostly just complaining that it is too expensive. Certainly can't argue with that. Personally, I run Framework, but I don't really run the latest-and-greatest mobile chipsets most of the time, and I don't think Framework is ideal for people who want that.)
Ultimately it'll be subjective, but the fans don't really spin up on my Framework 16 unless I push things. Running a game or compiling on all cores for a while will do the trick. The exact battery life, thermals and noise will be heavily dependent on the laptop; the TDP of modern laptop CPUs is probably mostly pretty comparable so a lot of it will come down to thermal design. Same for battery life and noise, depends a lot on things other than the CPU.
>Laptops in general are just better than they used to be, with modern CPUs and NVMe disks.
I've had my XPS 13 since 2016. Really the only fault I have against it nowadays is that 8GB of RAM is not sufficient to run IntelliJ anymore (hell, sometimes it even bogs down my 16GB MBP).
Now, I've also built an absolute beast of a workstation with a 7800X3D, 64GB RAM, 24GB VRAM and a fast SSD. Is it faster than both? Yeah. Is my old XPS slow enough to annoy me? Not really. YouTube has been sluggish to load/render lately, but I think that's much more Google making changes to worsen the Firefox/uBlock experience than any fault of the laptop.
Regarding YouTube, Google is also waging a silent war against Invidious. It's to the point that even running helper scripts to trick YouTube isn't enough (yet). I can't imagine battling active and clever adversaries speeds up YouTube page loads as it runs through its myriad checks that block Invidious.
I only do coding & browsing so maybe I'm a weak example but I find this even with my pretty old Intel laptops these days.
My Skylake one (I think that would be 6 years old now?) is doing absolutely fine. My Broadwell one is starting to feel a little aged but perfectly usable, I wouldn't even _consider_ upgrading it if I was in the bottom 95% of global income.
Compiling is very slow on these, but I think I'd avoid compilation on my laptop even if I had a cutting edge CPU?
Depends. I used to offload almost all compilation tasks, but now I only really do this if it's especially large. If I want to update my NixOS configuration I don't bother offloading it anymore. (NixOS isn't exactly Gentoo or anything, but I do have some overrides that necessitate a decent amount of compilation, mainly dogfooding my merge requests before they get merged/released.)
I am on Intel TGL currently and can't wait for Strix Halo next year. That is truly something else, it's nothing we have seen in notebooks before iGPU wise.
I've had a couple of Tiger Lake laptops, a Thinkpad and I believe my Surface Laptop 4. Based on my experience with current AMD mobile chipsets, I can only imagine the Strix Halo will be quite a massive uplift for you even if the generational improvements aren't impressive.
I've owned an M1 MBP base model since 2021 and I just got an M3 Max for work. I was curious to see if it "felt" different and was contemplating an upgrade to M4. You know what? It doesn't really feel different. I think my browser opens about 1 second faster from a cold start. But other than that, no perceptible difference day to day.
My work machine was upgraded from an M1 with 16GB of RAM to an M3 Max with 36GB and the difference in Xcode compile times is beyond belief: I went from something like 1-2 minutes to 15-20 seconds.
Obviously if opening a browser is the most taxing thing your machine is doing the difference will be minimal. But video or music editing, application-compiling and other intensive tasks, then the upgrade is PHENOMENAL.
FWIW I think that's more the core count than anything. I have a M1 Max as a personal machine and an M3 Max at work and while the M3 Max is definitely faster, it isn't world-beating.
My current work machine is M1 Max 64Gb and it's the fastest computer I've ever used. Watching rust code compile makes me laugh out loud it's so quick. Really curious what the newer ones are like, but tbh I don't feel any pressure to upgrade (could just be blissfully ignorant).
I think most of that difference is going to be the huge increase in performance core count between the base chip and the Max (from 4 to 12). The RAM certainly doesn't hurt though!
I went from an iPhone 12 to a 15 Pro Max, and the difference is significant. I can listen to Spotify while shooting with the camera. On my old iPhone 12, this was not possible.
Test Spotify against YouTube Music (and others) - I personally see no reason for Spotify when I have YouTube Premium, which performs with less overhead.
> I wonder what it will take to make Mac/iOS feel faster
I know, disabling shadows and customisable animation times ;) On a jailbroken phone I once could disable all animation delays, it felt like a new machine (must add that the animations are very important and generally great ux design, but most are just a tad too slow)
The 16 Pro has a dedicated camera button, which is a game changer for street/travel photography. I upgraded from a 13 Pro and use that. But no other noticeable improvements. Maybe Apple Intelligence summarizing wordy emails.
I think the only upgrade now is from a non-Pro to Pro, since a 120Hz screen is noticeably better than a 60Hz screen (and a borderline scam that a 1000 Euro phone does not have 120Hz).
I realize this isn't your particular use case. But with newer iPhones, you can use USB-C directly for audio. I've been using the Audio Technica ATH-M50xSTS for a while now. The audio quality is exceptional. For Slack/Team/Zoom calls, the sidetone feature plays your voice back inside the headphones, with the level being adjustable via a small toggle switch on the left side. That makes all the difference, similar to transparency/adaptive modes on the AirPod Pro 2s (or older cellphones and landlines).
I use a small Anker USB-A to USB-C adapter [1]. They're rock solid.
As great as the AirPod Pro 2s are, a wired connection is superior in terms of reliability and latency. Although greatly improved over the years, I still have occasional issues connecting or switching between devices.
Out of curiosity, what's the advantage of a jailbroken iPhone nowadays? I'd typically unlock Android phones in the past, but I don't see a need on iOS today.
Interestingly, the last time I used Android, I had to sideload Adguard (an adblocker). On the App Store, it's just another app alongside competing adblockers. No such apps existed in the Play Store to provide system-level blocking, proxying, etc. Yes, browser extensions can be used, but that doesn't cover Google's incessant quest to bypass adblockers (looking at you Google News).
> Out of curiosity, what's the advantage of a jailbroken iPhone nowadays? I'd typically unlock Android phones in the past, but I don't see a need on iOS today.
I have custom scripts, ad blocking without VPNs, and application firewalls.
I've found compile times on large C++ code bases to be the only thing I really notice improving. I recently upgraded my work machine from a 2017 i7 to a shiny new Ryzen 9 9950x and my clean compile times went from 3.5 minutes to 15 seconds haha. When I compile with an M2 Max, it's about 30s, so decent for a laptop, but also it was 2x the price of my new desktop workstation.
The biggest difference I’ve seen is iPad Sidecar mode works far more reliably with the M3 Max than the M1 Max.
There have been incremental improvements in speed and nits too, but having Sidecar not randomly crash once a day once on M3 was very nice.
Can confirm. I have an M2 Air from work and an M1 Pro for personal, and tbh, both absolutely fly. I haven't had a serious reason to upgrade. The only reason I do kind of want to swap out my M1 Pro is because the 13" screen is a wee small, but I also use the thing docked more often than not so it's very hard to justify spending the money.
On the other side, as someone doing a lot of work in the GenAI space, I'm simultaneously amazed that I can run Flux [dev] on my laptop and use local LLMs for a variety of tasks, while also wishing that I had more RAM and more processing power, despite having a top of the line M3 max MBP.
But it is wild that two years ago running any sort of useful genAI stuff on a MBP was more-or-less a theoretical curiosity, and already today you can easily run models that would have exceeded SotA 2 years ago.
Somewhat ironically, I got into the "AI" space a complete skeptic, but thinking it would be fun to play with nonetheless. After 2 years of daily work with these models, I'm becoming increasingly convinced they are going to be disruptive. No AGI, but they will certainly reduce a lot of labor and enable things that weren't really feasible before. Best of all, it's clear a lot of this work will be doable from a laptop!
> I haven’t had even a tinge of feeling that I need to upgrade after getting my M1 Pro MBP.
I upgraded my M1 MBP to a MacBook Air M3 15" and it was a major upgrade. It is the same weight but 40% faster and so much nicer to work on while on the sofa or traveling. The screen is also brighter.
I think very few people actually do need the heavy MBPs, especially not the web/full-stack devs who populate Hacker News.
EDIT: The screens are not different in terms of brightness.
> I think very few people actually do need the heavy MBPs, especially not the web/full-stack devs who populate Hacker News.
I can fairly easily get my M1 Air to have thermal issues while on extended video calls with some Docker containers running, and have been on calls with others having the same issue. Kind of sucks if it's, say, an important demo. I mostly use it as a thin client to my desktop when I'm away from home, so it's not really an issue, but if I were using it as a primary device I'd want a machine with a fan.
I try to avoid Docker in general during local dev, and luckily that has worked out for me even with microservice architectures. It dramatically reduces CPU and RAM needs and also reduces cycle time.
YouTube shows a small red "HDR" label on the video settings icon for actual HDR content. For this label to appear, the display must support HDR. With your M3 Pro, the HDR label should appear in Chrome and Safari.
You can also right-click on the video to enable "Stats for nerds" for more details. Next to color, look for "smpte2084 (PQ) / bt2020". That's usually the highest-quality HDR video [2,3].
You can ignore claims such as "Dolby Vision/Audio". YouTube doesn't support those formats, even if the source material used it. When searching for videos, apply the HDR filter afterward to avoid videos falsely described as "HDR".
Keep in mind that macOS uses a different approach when rendering HDR content. Any UI elements outside the HDR content window will be slightly dimmed, while the HDR region will use the full dynamic range.
I consider Vivid [4] an essential app for MacBook Pro XDR displays.
Once installed, you can keep pressing the "increase brightness" key to go beyond the default SDR range, effectively doubling the brightness of your display without sacrificing color accuracy. It's especially useful outdoors, even indoors, depending on the lighting conditions. And fantastic for demoing content to colleagues or in public settings (like conference booths).
> With your M3 Pro, the HDR label should appear in Chrome and Safari.
Ahh. Not Firefox, of course.
Thanks, I just ran a random nature video in Safari. It was pretty. The commercials before it were extremely annoying though. I don't think it's even legal here to have so many ads per minute of content as Google inserts on youtube.
For me, a faster refresh rate is noticeable on a phone or iPad, where you scroll all the time. On a laptop you don't do that much smooth scrolling, so for me it's a non-issue there; not once have I wished it had a faster refresh. Meanwhile, I always notice when switching between a Pro and non-Pro iPad.
I find 60Hz on the non-Pro iPhone obnoxious since switching to 120Hz screens. On the other hand, I do not care much about 60Hz when it comes to computer screens. I think touch interfaces make low refresh rates much more noticeable.
No doomscrolling at all. Even switching between home screens looks like it's dropping frames left and right (it's not, of course, but that's what it looks like coming from 120Hz). A Galaxy A54 that we still have in the house, which cost just over 300 Euro, feels much smoother than my old iPhone 15 that cost close to 1000 Euro, because the Galaxy has a 120Hz screen.
Even 90Hz (like on some Pixels) is substantially better than the iPhone's 60Hz.
The Galaxy must be new. In my experience Android phones get extremely laggy [1] as they get old and the 120 Hz refresh won't save you :)
I just noticed that I don't really try to follow the screen when I scroll down HN, for example. Yes it's blurry but I seem not to care.
[1] Source: my Galaxy something phone that I keep on my desk for when I do Android development. It has no personal stuff on it, it's only used to test apps that I work on, and even that isn't my main job (nothing since early spring this year for example). It was very smooth when I bought it, now it takes 5+ seconds to start any application on it and they stutter.
A lot of my work can be easily done with a Celeron - it's editing source, compiling very little, running tests on Python code, running small Docker containers and so on. Could it be faster? Of course! Do I need it to be faster? Not really.
I am due to update my Mac mini because my current one can't run Sonoma, but, apart from that, it's a lovely little box with more than enough power for me.
I still use Ivy Bridge and Haswell workstations (with Linux, SSD and discrete GPU) as my daily drivers and for the things I do they still feel fast. Honestly a new Celeron probably beats them performance wise.
The modern AMD or Intel desktops I've tried obviously are much faster when performing large builds and such but for general computing, web browsing, and so forth I literally don't feel much of a difference. Now for mobile devices it's a different story due to the increased efficiency and hence battery life.
My 2019 i9 flagship MBP is just so, so terrible, and my wife's M1 MacBook Air is so, so great. I can't get over how much better her computer is than mine.
It's so nice being able to advise a family member who is looking to upgrade their intel Mac to something new, and just tell them to buy whatever is out, not worry about release dates, not worry about things being out of date, and so on.
The latest of whatever you have will be so much better than the intel one, and the next advances will be so marginal, that it's not even worth looking at a buyer's guide.
I would normally never upgrade so soon after getting an M1 but running local LLMs is extremely cool and useful to the point where I'd want the extra RAM and CPU to run larger models more quickly.
I'm bumping from a still-excellent M1 MAX / 64GB to M4 MAX / 128GB, mostly for local GenAI. It gives me some other uplift and also enables me to sell this system while it's still attractive. I'm able to exhaust local 7B models fairly easily on it.
I don't think this has anything to do with the hardware. I think we have entered an age where users in general are not upgrading. As such, software can't demand more and more performance. The M1 came out at a time when hardware innovation had mostly stagnated. Default RAM in a laptop has been 16GB for over 5 years; 2 years ago, you couldn't even get more than 16GB in most laptops. As such, software hardware requirements haven't changed, so any modern CPU is going to feel overpowered. This isn't unique to the M1.
That’s because today’s hw is perfectly capable of running tomorrow’s software at reasonable speed. There aren’t huge drivers of new functionality that needs new software. Displays are fantastic, cellular speeds are amazing and can stream video, battery life is excellent, UIs are smooth with no jankiness, and cameras are good enough.
Why would people feel the need to upgrade?
And this applies already to phones. Laptops have been slowing for even longer.
Until everything starts running local inference. A real Siri that can operate your phone for you, and actually do things like process cross-app conditions ("Hey Siri, if I get an email from my wife today, notify me, then block out my calendar for the afternoon.") would use those increased compute and memory resources easily.
Apple has been shipping "neural" processors for a while now, and when software with local inference starts landing, Apple hardware will be a natural place for it. They'll get to say "Your data, on your device, working for you; no subscription or API key needed."
That's a very big maybe. The LLM experience locally is currently very very different from the hosted models most people play with. The future is still very uncertain.
Yep, the same, M1 Pro from 2021. It's remarkable how snappy it still feels years later, and I still virtually never hear the fan. The M-series of chips is a really remarkable achievement in hardware.
I always catch myself in this same train of thought until it finally re-occurs to me that "no, the variable here is just that you're old." Part of it is that I have more money now, so I buy better products that last longer. Part of it is that I have less uninterrupted time for diving deeply into new interests which leads to always having new products on the wishlist.
In the world of personal computers, I've seen very few must-have advances in adulthood. The only two unquestionable big jumps I can think of off hand are Apple's 5K screens (how has that been ten years?!) and Apple Silicon. Other huge improvements were more gradual, like Wi-Fi, affordable SSDs, and energy efficiency. (Of course it's notable that I'm not into PC gaming, where I know there has been incredible advances in performance and display tech.)
I agree with you about not needing to upgrade, but it still stands that IMHO Apple is better off when there's competition creating a real reason to upgrade. (Also, it's really good that Macs now have 16GB of RAM by default.) Having had my 14-inch M1 Max for a while, I believe the only reason I would want to upgrade is that the new models can be configured with 128GB of RAM, which lets you load newer AI models on-device.
The MacBook Pro does seem to have some quality-of-life improvements, such as Thunderbolt 5, a 12MP Center Stage camera (it follows you), three USB-C ports on every model, and battery life claims of 22-24 hours. Regardless, if you want a MacBook Pro and you don't have one, there is now an argument for not just buying the previous model.
I've had Macs before, from work, but there is something about the M1 Pro that feels like a major step up.
Only recently I noticed some slowness. I think Google Photos changed something and they show photos in HDR and it causes unsmooth scrolling. I wonder if it's something fixable on Google's side though.
Same. I used to upgrade every 1.5 years or so. But with every Apple Silicon generation so far I have felt that there are really no good reasons to upgrade. I have a MacBook M3 Pro for work, but there are no convincing differences compared to the M1 Pro.
In fact, I bought a highly discounted Mac Studio with M1 Ultra because the M1 is still so good and it gives me 10Gbit ethernet, 20 cores and a lot of memory.
The only thing I am thinking about is going back to the MacBook Air again since I like the lighter form factor. But the display, 24 GiB max RAM and only 2 Thunderbolt ports would be a significant downgrade.
And M1 from 4 years ago instead of M3 from last year; while a 2x speed improvement in the benchmarks they listed is good, it also shows that the M series CPUs see incremental improvements, not exponential or revolutionary. I get the feeling - but a CPU expert can correct me / say more - that their base design is mostly unchanged since M1, but the manufacturing process has improved (leading to less power consumption/heat), the amount of cores has increased, and they added specialized hardware for AI-related workloads.
That said, they are in a very comfortable position right now, with neither Intel, AMD, or another competitor able to produce anything close to the bang-for-watt that Apple is managing. Little pressure from behind them to push for more performance.
Their sales pitch when they released the M1 was that the architecture would scale linearly and so far this appears to be true.
It seems like they bump the base frequency of the CPU cores with every revision to get some easy performance gains (the M1 was 3.2 GHz and the M3 is now 4.1 GHz for the performance cores), but it looks like this comes at the cost of it not being able to maintain the performance; some M3 reviews noted that the system starts throttling much earlier than an M1.
Same feeling. The jump from all the previous laptops I owned to an M1 was an incredible jump. The thing is fast, has amazing battery life and stays cold.
Never felt the need to upgrade.
I have an MBP M1 Max and the only time I really feel like I need more oomph is when I'm doing live previews and/or rendering in After Effects. I find myself having to clear the cache constantly.
Other than that it cruises across all other applications. Hard to justify an upgrade purely for that one issue when everything else is so solid. But it does make the eyes wander...
Probably the next upgrade wave will come from AI features needing more local memory and compute. The software just isn't there yet for everyday tasks, but it's only a question of time, I guess. Of course there will be the usual pressure to do it in the cloud, but local compute will always remain a market.
And it's probably good that at least one of the big players has a business model that supports driving that forward.
I think regretting Mac upgrades is a real thing, at least for me. I got a 32G Mac mini in January to run local LLMs. While it does so beautifully, there are now smaller LLMs that run fine on my very old 8G M1 MacBook Pro, and these newer smaller models do almost all of what I want for NLP tasks, data transformation, RAG, etc. I feel like I wasted my money.
Which ones in particular? I have an M2 air with 8GB, and doing some RAG development locally would be fantastic. I tried running Ollama with llama3.2 and it predictably bombed.
I feel the same way about my M1 Macbook Air ... it's such a silly small and powerful machine. I've got money to upgrade, I just have no need. It's more than enough for even demanding Logic sessions and Ollama for most 8b models. I love it.
> Perhaps it’s my age, or perhaps it’s just the architecture of these new Mac chips are just so damn good.
I feel the same about my laptop from 2011, so I guess it is partly age (not feeling the urge to always have the latest and greatest) and partly that computing outside of LLMs and gaming is not demanding enough to force us to upgrade.
I think the last decade had an explosion in the amount of resources browsers needed and used (partly workloads moving over, partly moving to more advanced web frameworks, partly electron apps proliferating).
The last few years Chrome seems to have stepped up energy and memory use, which impacts most casual use these days. Safari has also become more efficient, but it never felt bloated the way Chrome used to.
I have exactly the same experience. Usually after 3 years I'm desperate for a new Mac, but right now I genuinely think I'd prefer not to change. I have absolutely no issues with my M1 Pro; battery and performance are still great.
But this ad is specifically for you! (Well, and those pesky consumers clinging on to that i7!):
> Up to 7x faster image processing in Affinity Photo when compared to the 13‑inch MacBook Pro with Core i7, and up to 1.8x faster when compared to the 13-inch MacBook Pro with M1.
The only reason I'd want to upgrade my M1 Pro MBP is because I kind of need more RAM and storage. The fact that I'm even considering a new laptop just for things that before could have been a trivial upgrade is quite illuminating.
100% agree on this. I've had this thing for 3 years and I still appreciate how good it is. Of course the M4 tingles my desire for new cool toys, but I honestly don't think I would notice much difference with my current use.
I feel exactly the same. The one thing that would get me to pull the trigger on a newer one is if they start supporting SVE2 instructions, which would be super useful for a specific programming project I’ve been playing with.
I’m using the M3 Air 13 in (splurged for 24 GB of RAM, I’m sure 16 is fine) to make iOS apps in Xcode and produce music in Ableton and it’s been more than performant for those tasks
Only downside is the screen. The brightness sort of has to be maxed out to be readable and viewing at a wrong angle makes even that imperfect
That said it’s about the same size / weight as an iPad Pro which feels much more portable than a pro device
I am replacing a Dell laptop because the case is cracking, not because it's too slow (it isn't lightning fast, of course, but it sure is fast enough for casual use).
Tbf, the only thing I miss with my M2 MacBook is the ability to run x86_64 VM’s with decent performance locally.
I’ve tried a bunch of ways to do this - and frankly the translation overhead is absolute pants currently.
Not a showstopper though, for the 20-30% of complete pain in the ass cases where I can’t easily offload the job onto a VPS or a NUC or something, I just have a ThinkPad.
Yup, honestly the main reason I'd like to upgrade from my M1 MBA is the newer webcams are 1080p instead of 720p, and particularly much better in low light like in the evening.
If you're in the ecosystem, get an iPhone mount - the image quality is unreal compared to anything short of a fancy DSLR setup. It does take some setup, but not much, thanks to the magnets in the iPhone.
when the hardware wait time is the same as the duration of my impulsive decisions i no longer have a hardware speed problem, i have a software suggestion problem
I got an MBP M1 with 32GB of RAM. It'll probably be another 2-3 years, if not longer, before I feel the pressure to upgrade. I've even started gaming again (something I dropped nearly 20 years ago when I switched to Mac) thanks to GeForce Now. I just don't see the reason.
Frankly though, if the mac mini was a slightly lower price point I'd definitely create my own mac mini cluster for my AI home lab.
I hate to say it but that's like a boomer saying they never felt the need to buy a computer, because they've never wished their pen and paper goes faster. Or a UNIX greybeard saying they don't need a Mac since they don't think its GUI would make their terminal go any faster. If you've hit a point in your life where you're no longer keeping up with the latest technological developments like AI, then of course you don't need to upgrade. A Macbook M1 can't run half the stuff posted on Hugging Face these days. Even my 128gb Mac Studio isn't nearly enough.
> If you've hit a point in your life where you're no longer keeping up with the latest technological developments like AI, then of course you don't need to upgrade.
That's me. I don't give a shit about AI, video editing, modern gaming or Kubernetes. The newest and heaviest piece of software I care about is VSCode. So I think you're absolutely correct. Most things released since Docker and VSCode have not contributed massively to how I work, and most of the things I do could be done just fine 8-10 years ago.
I think the difference is that AI is a very narrow niche/hobby at the moment. Of course if you're in that niche having more horsepower is critical. But your boomer/greybeard comparisons fall flat because they're generally about age or being set in your ways. I don't think "not being into AI image generation" is (currently) about being stuck in your ways.
To me it's more like 3d printing as a niche/hobby.
Playing with them locally? Yes, of course it's a niche hobby. The people doing stuff with them that isn't either playing around or developing a specific sort of AI product are just using ChatGPT or some other prepackaged thing that either doesn't run locally, or does, but is sized to fit on ordinary hardware.
When something accounts for < 1% of all engagement with a category, it's a niche/hobby, yes.
I get that you're probably joking, but - if I use Claude / ChatGPT o1 in my editor and browser, on an M1 Pro - what exactly am I missing by not running e.g. HF models locally? Am I still the greybeard without realising?
Using the term "bro" assumes that all AI supporters are men. This erases the fact that many women and nonbinary people are also passionate about AI technology and are contributing to its development. By using "AI bro" as an insult, you are essentially saying that women and nonbinary people are not welcome in the AI community and that our contributions don't matter. https://www.reddit.com/r/aiwars/comments/13zhpa7/the_misogyn...
Is there an alternative term you would prefer people to use when referring to a pattern of behavior perceived as a combination of being too excited about AI and being unaware (perhaps willfully) that other people can reasonably be much less interested in the hype? Because that argument could definitely benefit from being immune to deflections based on accusations of sexism.
When I see that someone is excited about something, I believe in encouraging them. If you're looking for a more polite word to disparage people who love and are optimistic about something new, then you're overlooking what that says about your character. Also AI isn't just another fad like NFTs and web3. This is it. This is the big one.
> Also AI isn't just another fad like NFTs and web3. This is it. This is the big one.
That's thoroughly unconvincing. That kind of talk is exactly what so many people are tired of hearing. Especially if it's coming from technically-minded people who don't have any reason to be talking like PR drones.
What makes you think I care about convincing you? These days every shot caller on earth is scrambling to get piece of AI. Either by investing in it or fighting it. You come across as someone who wants to hate on AI. Haters aren't even players. They're NPCs.
So people who aren't obsessed with AI to your liking are:
- boomer luddites
- primitive single-celled organisms
- NPCs
And even people who are enthusiastic about AI but aren't fanatical about running it locally get scorn from you.
I can understand and forgive some amount of confirmation bias leading you to overestimate the importance and popularity of what you work on, but the steady stream of broad insults at anyone who even slightly disagrees with you is dismaying. That kind of behavior is wildly inappropriate for this forum. Please stop.
That’s interesting because I would’ve thought having strong local compute was the old way of thinking. I run huge jobs that consume very large amounts of compute. But the machines doing the work aren’t even in the same state I’m in. Then again maybe I’m even older as I’m basically on the terminal server / mainframe compute model. :)
I work with AI models all day every day, keep up with everything, love frontier tech, I love and breathe LLMs. And I, like OP, haven't seen the need to upgrade from the M1 MBP because it runs the small 1-7B models just fine, and anything bigger I want on some GPU instance anyway, or I want a frontier model which wouldn't run on the newest and biggest MBP. So it's not just us Boomers hating on new stuff, the M series MacBooks are just really good.
Given that models are only going to get larger, and the sheer amount of compute required, I think the endgame here is dedicated "inference boxes" that actual user-facing devices call into. There are already a couple of home appliances like these - NAS, home automation servers - which have some intersecting requirements (e.g. storage for NAS) - so maybe we just need to resurrect the "home server" category.
I agree, and if you want to have the opportunity to build such a product, then you need a computer whose specs today are what a home server would have in four years. If you want to build the future you have to live in the future. I'm proud to make stuff most people can't even run yet, because I know they'll be able to soon. That buys me time to polish their future and work out all the bugs too.
So every user of a computer that doesn't create their own home-grown ML models is a boomer? This can't possibly be a generational thing. Just about everyone on the planet is at a place in their life where they don't make their own AIs.
Eventually as the tools for doing it become better they'll all want to or need to. By then, most computers will be capable of running those tools too. Which means when that happens, people will come up another way to push the limits of compute.
I also have an M1 Pro MBP and mostly feel the same. The most tempting thing about the new ones is the space black option. Prior to the M1, I was getting a new laptop every year or two and there was always something wrong with them - butterfly keyboard, Touch Bar etc. This thing is essentially perfect though, it still feels and performs like a brand new computer.
Same boat—I'm on a lowly M1 MacBook Air, and haven't felt any need to upgrade (SwiftUI development, video editing, you name it), which is wild for a nearly 4 year-old laptop.
Yeah, I feel like Apple has done the opposite of planned obsolescence with the M chips.
I have a Macbook Air M1 that I'd like to upgrade, but they're not making it easy. I promised myself a couple of years ago I'll never buy a new expensive computing device/phone unless it supports 120 hertz and Wi-Fi 7, a pretty reasonable request I think.
I got the iPhone 16 Pro, guess I can wait another year for a new Macbook (hopefully the Air will have a decent display by then, I'm not too keen to downgrade the portability just to get a good display).
So the Intel era is not Apple products? The butterfly keyboard is not an Apple invention?
They have the highest product quality of any laptop manufacturer, period. But to say that all Apple products hold value well is simply not true. All quality products hold value well, and most of Apple's products are quality.
I guarantee you that if Apple produced a trashy laptop it would have no resell value.
It's expected Intel-based Macs would lose value quickly considering how much better the M1 models were. This transition was bigger than when they moved from PowerPC to Intel.
One complicating factor in the case of the Intel Macs is that an architectural transition happened after they came out. So they will be able to run less and less new software over the next couple of years, and they lack most AI-enabling hardware acceleration.
That said, they did suffer from some self inflicted hardware limitations, as you hint. One reason I like the MBP is the return of the SD card slot.
Similar for me. MacBook Air M1 (8 cpu / 8 gpu; 16 GB RAM)...running in or out of clamshell with a 5k monitor, I rarely notice issues. Typically, if I'm working very inefficiently (obnoxious amount of tabs with Safari and Chrome; mostly web apps, Slack, Zoom, Postman, and vscode), I'll notice a minor lag during a video call while screen sharing...even then, it still keeps up.
(Old Pentium Pro, PII, multi chip desktop days) -- When I did a different type of work, I would be in love with these new chips. I just don't throw as much at my computer anymore outside of things being RAM heavy.
The M1 (with 16 GB ram) is really an amazing chip. I'm with you, outside of a repair/replacement? I'm happy to wait for 120hz refresh, faster wifi, and longer battery life.
> Yeah, I feel like Apple has done the opposite of planned obsolescence with the M chips.
They always have. If you want an objective measure of planned obsolescence, look at the resale value. Apple products hold their resale value better than pretty much every competitor because they stay useful for far longer.
Most retailers have had the older models on closeout for a few weeks now. Best Buy, Amazon and Costco have had the M3 models for a few hundred off depending on models.
Watch SlickDeals. I think it was this time last year where lots of refurbs/2 generation old machines were going for massive discounts. Granted they were M1 machines, but some had 64GB RAM and 4TB drives for like $2700. Microcenter and B&H are good ones to watch as well.
The M-series macbooks depreciate in value far slower than any of the Intel models. M1 base models can still sell for nearly $1k. It's difficult to find a really good deal.
Question without judgement: why would I want to run LLM locally? Say I'm building a SaaS app and connecting to Anthropic using the `ai` package. Would I want to cut over to ollama+something for local dev?
Data privacy-- some stuff, like all my personal notes I use with a RAG system, just don't need to be sent to some cloud provider to be data mined and/or have AI trained on them
Does anyone understand this claim from the press release?
> M4 Max supports up to 128GB of fast unified memory and up to 546GB/s of memory bandwidth, which is 4x the bandwidth of the latest AI PC chip. This allows developers to easily interact with large language models that have nearly 200 billion parameters.
Having more memory bandwidth is not directly helpful in using larger LLM models. A 200B param model requires at least 200GB RAM quantized down from the original precision (e.g. "bf16") to "q8" (8 bits per parameter), and these laptops don't even have the 200GB RAM that would be required to run inference over that quantized version.
How can you "easily interact with" 200GB of data, in real-time, on a machine with 128GB of memory??
Wouldn't it be incredibly misleading to say you can interact with an LLM, when they really mean that you can lossy-compress it to like 25% size where it becomes way less useful and then interact with that?
(Isn't that kind of like saying you can do real-time 4k encoding when you actually mean it can do real-time 720p encoding and then interpolate the missing pixels?)
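As a concrete illustration of the arithmetic above, here is a minimal sketch, assuming the weights dominate memory use and ignoring KV cache and runtime overhead (which only push the requirement higher), of what does and does not fit in 128GB of unified memory:

```python
# Back-of-the-envelope RAM needed for LLM weights at different quantizations.
# Assumption: weights dominate; KV cache, activations, and runtime overhead
# are ignored, and they only push the real requirement higher.

def weights_gb(params: float, bits_per_param: int) -> float:
    """Gigabytes (decimal GB) needed just to hold the weights."""
    return params * bits_per_param / 8 / 1e9

params = 200e9           # "nearly 200 billion parameters"
unified_memory_gb = 128  # top M4 Max configuration

for name, bits in [("bf16", 16), ("q8", 8), ("q4", 4)]:
    need = weights_gb(params, bits)
    verdict = "fits" if need < unified_memory_gb else "does NOT fit"
    print(f"{name:>4}: ~{need:5.0f} GB of weights -> {verdict} in {unified_memory_gb} GB")

# bf16: ~  400 GB of weights -> does NOT fit in 128 GB
#   q8: ~  200 GB of weights -> does NOT fit in 128 GB
#   q4: ~  100 GB of weights -> fits in 128 GB
#        (and only barely, once macOS and the KV cache take their share)
```

So the press release's claim presumably assumes something like 4-bit quantization, which is exactly the lossy compression the parent comment is objecting to.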
Tethering to an iPhone is so easy though - just select it in the Wifi menu. I'm not sure if I'd ever pay for an LTE modem option. I'm sure it would be better efficiency and performance to have it built-in, but I wouldn't think many people care enough about that small difference to offer it as an option.
I use the tethering quite often. I have for years. It is flaky and burns two batteries instead of one. I agree that many people do not care. Some of us who are traveling a lot are willing to pay for more options.
It's not about efficiency or performance, it's about not having to own the iPhone in the first place. Just put a SIM card inside the laptop and forget about it. Windows laptops can even seamlessly switch between wifi and LTE depending on which one is available. But of course Apple would never allow that because they want to force you to own the full set of Apple devices. Laptop being self-sufficient would be against their policy.
Not to mention that in the US the cell phone carriers artificially limit tethering speed or put data caps on it when you tether from your phone. You have to buy a dedicated data-only plan and modem.
I wonder if one of the obstacles is the amount of data that would likely be used.
Most cellular carriers offer unlimited on-device data plans, but they cap data for tethering. Integrating an LTE modem into a laptop essentially requires a mobile data plan with unlimited tethering - which, AFAIK, doesn’t exist at the moment. I’m not sure why.
Integrating an LTE modem into an iPad requires a mobile data plan, and thats about it. It's not "tethered" if its built into the device.
I've always heard that patent disputes were at the root of the lack of a modem option. Apple had a prototype MacBook Pro back in the early Intel days IIRC but it was never released.
Maybe if Apple ever gets their in-house modems working, we'll see them on all of the product lines, but until then, it's a niche use case that likely isn't causing them to lose a ton of sales.
> It's not "tethered" if its built into the device.
I understand that. My point is that I think an LTE modem in a laptop might reasonably use far more data than an LTE modem in a phone or tablet. Most people who download and/or upload very large files do so on their computer rather than their mobile devices.
There is no reason macOS cannot have some option for throttling usage by background updates when connected over LTE. iPads have an LTE option.
That carriers have not figured out how to charge me by the byte over all my devices instead of per device is really not a big issue to me. I would like to pay for an LTE modem and the necessary bandwidth.
My intuition is that when Apple has their own LTE modem and is not dependent on Qualcomm, a MacBook Pro will have an option similar to that for Dell power users.
The industry as a whole is trying its best to not rely on Qualcomm, given its extremely litigious past. Apple already tried once to avoid using their chips for the iPhone's modem, which I seem to recall failed. When it comes to devices for enterprise, it's less of a gamble because the cost can be passed on to orgs who are less price sensitive.
For normal web dev, any M4 CPU is good as it is mostly dependent on single core speed. If you need to compile Unreal Engine (C++ with lots of threads), video processing or 3D rendering, more cores is important.
I think you need to pick the form factor that you need combined with the use case:
- Mobility and fast single core speeds: MacBook Air
- Mobility and multi-core: MacBook Pro with M4 Max
The single most annoying thing about this announcement for me is the fact that I literally just paid for an Asus ProArt P16 [0] on the basis that the Apple offerings I was looking at were too expensive. Argh!
> Simultaneously supports full native resolution on the built-in display at 1 billion colors and:
> Up to two external displays with up to 6K resolution at 60Hz over Thunderbolt, or one external display with up to 6K resolution at 60Hz over Thunderbolt and one external display with up to 4K resolution at 144Hz over HDMI
> One external display supported at 8K resolution at 60Hz or one external display at 4K resolution at 240Hz over HDMI
> Up to 4.6x faster build performance when compiling code in Xcode when compared to the 16‑inch MacBook Pro with Intel Core i9, and up to 2.2x faster when compared to the 16‑inch MacBook Pro with M1 Max.
OK, that's finally a reason to upgrade from my M1.
Does anyone know if there is a way to use Mac without the Apple bloatware?
I genuinely want to use it as my primary machine, but on this Intel MacBook Pro I have, I absolutely dislike FaceTime, iMessage, the push to use the App Store, and Apple constantly asking me for an Apple ID and password (which I don't have and have zero intention of creating). I want to block Siri, disable all the telemetry Apple has baked in, stop the machine calling home, etc.
This would mirror the tools available on Windows to disable and remove Microsoft bloatware and the ad tracking built in.
There is zero iCloud account requirement. You do not need to use the App Store. Gatekeeper can be disabled with a configuration profile key. Telemetry (what little there is) can be disabled with a configuration profile key. Siri can be disabled, all of the generative AI crap can be disabled, yadda yadda yadda, with a configuration profile key. Every background service can be listed and disabled if you disable authenticated-root. Hell, you could disable `apsd` and disable all push notifications too, which require a phone home to Apple.
IIRC Apple is a lot less heavy-handed with service login requirements compared to Microsoft's most recent Windows endeavors. And depending on the developer, you can get around having to use the App Store at all. Since you're on an Intel Mac, have you considered just using Linux?
You can use OSX without an Apple account and paired with a 3rd party host based firewall (Little Snitch), the OS usually stays out of your way (imo). Bundled apps can be removed after disabling SIP (file integrity) but there are downsides/maintenance to that route.
At a linux conference I saw many macbooks. Talked to a few, they just ran linux in a VM full screen for programming and related. Then used OSX for everything else (office, outlook, teams, work enforced apps, etc). They seemed very happy and this encouraged them to not task switch as often.
There used to be this whole contingent of people who were adamant that Apple's software was too opinionated and bloated, that you couldn't adapt its OS to your needs, and that Apple was far too ingrained in your relationship with your device. That Linux was true freedom, and that at least Windows respected its users.
I belong to that contingent, and I still stand by the assertion that Apple's software is too opinionated, configurability is unreasonably low, and you have to stick to the Apple ecosystem for many things to get the most out of it.
My primary desktop & laptop are now both Macs because of all the malarkey in Win11. Reappearance of ads in Start and Windows Recall were the last straws. It's clear that Microsoft is actively trying to monetize Windows in ways that are inherently detrimental to UX.
I do have to say, though, that Win11 is still more customizable overall, even though it - amazingly! - regressed below macOS level in some respects (e.g. no vertical taskbar option anymore). Gaming is another major sticking point - the situation with non-casual games on macOS is dismal.
I gave up on macos when they started making the OS partition read-only. A good security feature in general, but their implementation meant that changing anything became a big set of difficulties and trade-offs.
That, combined with the icloud and telemetry BS, I'd had enough.
Not only good security, but it also makes software updates a lot faster because you don't have to check if the user has randomly changed any system files before patching them.
Upgraded to an M1 Pro 14 in December 2021, and I still rock it every day for dev purposes. Apple makes great laptops.
The only downside is that I see a kind of "burnt?" transparent spot on my screen. Also, when connecting over an HDMI cable, the sound does not output properly to the TV, and the video I play becomes laggy. Wondering if going to the Apple Store would fix it?
> MacBook Air with M2 and M3 comes standard with 16GB of unified memory, and is available in midnight, starlight, silver, and space gray, starting at $999 (U.S.) and $899 (U.S.) for education.
At long last, I can safely recommend the base model macbook air to my friends and family again. At $1000 ($900 with edu pricing on the m2 model) it really is an amazing package overall.
As for the section where they demoed Apple Intelligence assisting a researcher with creating an abstract and adding pictures to their paper: is it better or worse to do this? People are already complaining heavily about dead internet theory, with the 'AI voice' being so prominent.
I'm not sure we can leverage the neural cores for now, but they're already rather good for LLMs, depending on what metrics you value most.
A specced out Mac Studio (M2 being the latest model as of today) isn't cheap, but it can run 180B models, run them fast for the price, and use <300W of power doing it. It idles below 10W as well.
It is interesting they only support 64GB and then jump to 128GB. It seems like a money play, since it's $1,000 to upgrade to 128GB, and if you're running something that needs more than 64GB (like LLMs?) you kind of have no choice.
I have an M2 Max now, and it's incredible. But it still can't handle running xcode's Instruments. I'd upgrade if the M4s could run the leaks tool seamlessly, but I doubt any computer could.
Once they get a MacBook Air with an M4, it will become a viable option for developers and other users that want/need 2 external monitors. Definitely looking forward to that happening.
Poor. My M3 Max/128GB is about 20x slower than 4090. For inference it's much better, still much slower than 4090 but it enables working with much larger LLMs albeit at ~10t/s (in comparison, Threadripper 2990WX/256GB does like 0.25t/s). M4 Max is likely going to be ~25% faster than M3 Max based on CPU perf and memory bandwidth.
When I have a full team of people with 1080p webcams and a solid connection I can notice the quality. Most of the time not everyone fulfills those requirements and the orchestrator system has to make do
I mean, you can easily create your own fully meshed P2P group video chat in your browser with just a little bit of JS that would support everyone running 4K, but it will fall apart the moment you get more than 3-8 people, as each person's video stream eats 25 Mbps for each direction of a peer connection (or 2x per edge in the graph).
A huge part of group video chat is still "hacks" like downsampling non-speaking participants so the bandwidth doesn't kill the connection.
As we get fatter pipes and faster GPUs streaming will become better.
edit: I mean... I could see a future where realtime video feeds never get super high resolution and everything effectively becomes a relatively seamless AI recreation where only facial movement data is transmitted, similar to how game engines work now.
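To put rough numbers on the mesh point above, here is a minimal sketch, assuming the 25 Mbps-per-stream figure from the earlier comment (each participant sends a copy of its own stream to, and receives a stream from, every other peer); `mesh_bandwidth_mbps` is just a hypothetical helper for the arithmetic:

```python
# Rough per-participant bandwidth for a fully meshed P2P video call.
# Assumption: a fixed bitrate per video stream (25 Mbps, as in the
# comment above); no simulcast, SFU, or downsampling tricks.

def mesh_bandwidth_mbps(participants: int, stream_mbps: float = 25.0):
    """Upload and download both scale with the number of peers."""
    peers = participants - 1
    upload = peers * stream_mbps    # one copy of your stream per peer
    download = peers * stream_mbps  # one incoming stream per peer
    return upload, download

for n in (2, 4, 6, 8):
    up, down = mesh_bandwidth_mbps(n)
    print(f"{n} participants: {up:3.0f} Mbps up / {down:3.0f} Mbps down each")

# 2 participants:  25 Mbps up /  25 Mbps down each
# 4 participants:  75 Mbps up /  75 Mbps down each
# 6 participants: 125 Mbps up / 125 Mbps down each
# 8 participants: 175 Mbps up / 175 Mbps down each
```

The upload side is what kills a residential connection first, which is why production systems fall back to an SFU and the downsampling "hacks" mentioned above.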
4K for videoconferencing is nuts. The new camera should be an improvement over the old one. Plus, being able to show your actual, physical desktop can be handy too. Using your iPhone as the webcam will still probably give you the best quality, especially if you are in a lower-light situation.
Disingenuous to mention the x86 based MacBooks as a basis for comparison in their benchmarks; they are trying to conflate current-gen Intel with what they shipped more than 4 years ago.
Are they going to claim that 16GB RAM is equivalent to 32GB on Intel laptops? (/sarc)
Lots of people don't upgrade on the cadence that users on this forum do. Someone was mentioning yesterday that they are trying to sell their Intel Mac [edit: on this forum] and asking advice on getting the best price. Someone else replied that they still had a 2017 model. I spoke to someone at my job (I'm IT) who told me they'd just ordered a new iMac to replace one that is 11 years old. There's no smoke and mirrors in letting such users know what they're in for.
Yup, I'm a developer who still primarily works on a 2018 Intel Mac. Apple's messaging felt very targeted towards me. Looking forward to getting the M4 Max as soon as possible!
Given that they also compare it to an M1 in the same aside, I'd say you're wrong.
> Up to 23.8x faster basecalling for DNA sequencing in Oxford Nanopore MinKNOW when compared to the 16-inch MacBook Pro with Core i9, and up to 1.8x faster when compared to the 16-inch MacBook Pro with M1 Pro.
Ben Bajarin said that around 50% of the installed base is still using Macs with Intel chips. You'll keep hearing that comparison until that number goes down.
The base M4 Max only has an option for 36GB of RAM!? They're doing some sus things with that pricing ladder again. There's no more 96GB option, and to go beyond 48GB I'd have to spend another $1250 CAD on a processor upgrade first, and in doing so lose the option to keep the now-baseline 512GB SSD.
I'd add that although I find it a bit dirty, the computers are obviously still amazing. It's just a bit bizarre that only the lower-spec CPU offers the customer the option to change the RAM quantity. More specifically, going from the M4 Pro to the M4 Max removes the option to change the RAM from 36GB, whereas sticking with the Pro lets you select 48GB or 24GB, unless you choose the maxed-out Max. If I pre-order the Mac Mini with the same processor, I can select 64GB for the insane price of an additional $750 CAD, but that's just not available on the MacBook Pro with the M4 Pro.
It would indeed have been nice to see a faster response rate screen, even though I value picture quality more, and it also would have been nice to see even vaguely different colors like the iMac supposedly got, but it seems like a nice spec bump year anyway.
I think any idea that Apple doesn't thoroughly understand the capacity, value, market, price tradeoff is untenable.
The most obvious view is that Apple price gouges on storage. But this seems too simplistic.
My conjecture is that there's an inescapable tension between supply (availability/cost), sales forecasts, technological churn, and roadmaps that leads them to want to somewhat subsidize the lowest end and place a bit of back-pressure on consumption at the high end. The trick is finding the tipping point on the curve between growth and over-commitment by suppliers, especially for tightly vertically integrated products.
The PC industry is more diffuse and horizontal and so more tolerant of fluctuations in supply and demand across a broader network of providers and consumers, leading to a lower, more even cost structure for components and modules.
In real terms, Apple's products keep costing less, just like all computer products. They seem to make a point of holding prices on an appearance point of latest tech that's held steady since the first Macs: about $2500 for a unit that meets the expectations of space right behind the bleeding edge while being reliable, useful and a vanguard of trends.
Seems plausible enough to me, but whether there's a business case or not isn't my concern so much as how it feels to price something out knowing that I'm deliberately gouged on arbitrary components instead of the segmentation being somewhat more meaningful. They're already reaping very high margins, but by tightly coupling quantities of those components to even higher-margin builds, it feels a bit gross, to the point where I just have to accept that I'd have to spend even more excessively than in previous years on similar models. As in, I'm happy to pay a bit more for more power if I find it useful, likewise with RAM, but not being able to get more RAM without first getting something I have no way to put to use seems a bit silly, akin to not being able to get better seats in a car unless I first get the racing-spec version, otherwise I'm stuck with a lawn chair.
You can just turn it off. macOS lets you change the resolution to use just the screen below the notch, and because it's mini-LED, the now unused "flaps" to the sides of the notch are indistinguishable from the rest of the bezel.
Looking at how long the 8GB base configuration lasted, it's a pretty sure bet that now you won't need to upgrade for a good few years.
I mean, I have a MacBook air with 16gb of ram and it's honestly working pretty well to this day. I don't do "much" on it though but not many people do.
I'd say the one incentive the MacBook Pro has over the Air is the better screen and better speakers. Not sure if it's worth the money.
My hypothesis is Apple is mostly right about their base model offerings.
> I mean, I have a MacBook air with 16gb of ram and it's honestly working pretty well to this day. I don't do "much" on it though but not many people do.
If an HN user can get along with 16gb on their MacBook Air for the last X years, most users were able to get by with 8gb.
It's just a tactic to get a higher average price while being able to advertise a lower price. What makes it infuriating is memory is dirt cheap. That extra 8GB probably costs them $10 at most, but would add to utility and longevity of their hardware quite a bit.
They are supposed to be "green" but they encourage obsolescence.
They align need with more CPU and margin. Apple wants as few SKUs as possible and as much margin as possible.
8GB is fine for most use cases. Part of my gig is managing a huge global enterprise with six figures of devices. Metrics demonstrate that the lower quartile is ok with 8GB, even now. Those devices are being retired as part of the normal lifecycle with 16GB, which is better.
Laptops are 2-6 year devices. Higher end devices always get replaced sooner - you buy a high end device because the productivity is worth spending $. Low end tend to live longer.
People looking for low prices buy PC, they don't even consider Mac. Then they can have a computer with all the "higher numbers", which is more important than getting stuff done.
I bought a framework back in 2020 or so and really wish I just waited a little longer and spent a few hundred bucks more on the M1.
It's fine, but the issue is linux sleep/hibernate - battery drain. To use the laptop after a few days, I have to plug it in and wait for it to charge a little bit because the battery dies. I have to shut it down (not just close the screen) before flying or my backpack becomes a heater and the laptop dies. To use a macbook that's been closed for months I just open it and it works. I'll pay double for that experience. If I want a computer that needs to be plugged in to work I have a desktop for that already. The battery life is not good either.
Maybe it's better now if I take the time to research what to upgrade, but I don't have the time to tinker with hardware/linux config like I did a few years ago.
I don't mind spending a thousand bucks every 7 years to upgrade my laptop. I've had this MacBook Air since 2020 and, besides the speakers not being the best... I have no complaints.
I don't really see a world where this machine doesn't last me a few more years. If there's anything I'd service, it would be the battery, but eh. It lasts more than a few hours and I don't go out much.
This is the first time they have not obscenely limited their starter offerings to 8GB of RAM. The time it took them to do that is just pathetic. Now I guess this will go on until when? Maybe 2034, still starting at 16GB. I wish I could say it's a welcome change, but in 2024, for such an overpriced machine, starting with 16GB of RAM is anything but special. Also, I am sure the SSDs and RAM are still soldered tight :)
If only they allowed their iPads to be used as a Mac screen natively, I might buy a Mini and an iPad and cover two use cases in one go. But why would Apple want users to be able to do that without extra expense?
I'm just some dude, looking at a press release, wondering when Tim Apple is gonna be a cool dude and release the MBP in all of the colors that they make the iMac in.
APPARENTLY NOT TODAY.
C'mon mannnnn. The 90s/y2k are back in! People want the colorful consumer electronics! It doesn't have to be translucent plastic like it was back then but give us at least something that doesn't make me wonder if I live in the novel The Giver every time I walk into a meetup filled with MacBook Pros.
I really like these new devices, but I’ve found that the latest MacBook Air (M3) is sufficient for my needs as a manager and casual developer. My MacBook Pro M1 Max has essentially become a desktop due to its support for multiple monitors, but since the Mac Mini M4 Pro can also support up to three external displays, I’m considering selling the MacBook Pro and switching to the Mini. I’ve also noticed that the MacBook Pro’s battery, as a portable device, is less efficient in terms of performance/battery (for my usage) compared to the MacBook Air.
Regarding LLMs, the hottest topic here nowadays, I plan to either use the cloud or return to a bare-metal PC.
I used a Surface Pro for 6 years and haven't missed the touch screen once since switching back to an MBP 3 years ago. I would have missed the handwriting input, but that's what a low-end iPad is for.
Would it make sense to upgrade from M2 Pro 16 to M4 Pro 16? (both base models)
I mean, in terms of numbers: more cores, more RAM, but everything else is pretty much the same. I am looking forward to seeing some benchmarks!
Have they published this ahead of other pages or is it just me?
The linked Apple Store page says "MacBook Pro blasts forward with the M3, M3 Pro, and M3 Max chips" so it seems like the old version of the page still?
I recently switched back to using homemade desktops for most of my work. I've been running Debian on them. I still have my Mac laptop for working on the go.
To be fair, the link in this story is to a press release. Arguably there are probably many things in it that can be considered "misleading" in certain contexts.
What's the deal with running Linux on these anyway? Could one conceivably set up an M4 mini as a headless server? I presume Metal would be impossible to get working if macOS uses proprietary drivers for it...
I find it very odd that the new iMac has WiFi 7 but this does not... Also it is so aggravating they compare to 3 generations ago and not the previous generation in the marketing stats. It makes the entire post nearly useless.
It is very aggravating, but if they advertised a comparison to last year's model and showed you small performance gains you might not want to buy it.
A more charitable interpretation is that Apple only thinks that people with computers a few years old need to upgrade, and they aren't advertising to people with a <1 year old MacBook Pro.
The software stack has gotten so bad that no amount of hardware can make up for it.
The compile times for Swift, the gigabytes of RAM everything seems to eat up.
I closed all my apps and I'm at 10gb of RAM being used - I have nothing open.
Does this mean the Macbook Air 8gb model I had 10 years ago would basically be unable to just run the operating system alone?
It's disconcerting. Ozempic for terrible food and car-centric infrastructure we've created, cloud super-computing and 'AI' for coping with this frankenstein software stack.
The year of the Linux desktop is just around the corner to save the day, right? Right? :)
Activity Monitor counts everything from I/O buffer caches to frame buffers in the "memory used" metric. Add in that MacOS won't free pages until it hits a certain memory pressure and you get a high usage with no desktop apps open.
This also means that cleanly booted machine with 16 GB will show more memory used than a machine with 8 GB.
Apple suggests you use the memory pressure graph instead to determine whether you're low on memory for this reason.
It's very hard to measure memory use because it's reactive to how much RAM you have; if you have more then it's going to use it. That doesn't necessarily mean there are any issues.
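If you want to see that distinction yourself, here is a rough sketch that parses the output of macOS's `vm_stat` command-line tool. Activity Monitor's own accounting is more involved, and the grouping of fields below into "used" versus "reclaimable" is my own approximation, not Apple's:

```python
# Rough sketch: separate memory that is genuinely in use from caches that
# macOS can reclaim under pressure, using the output of `vm_stat`.
# The grouping below is approximate; categories can overlap slightly.

import re
import subprocess

out = subprocess.run(["vm_stat"], capture_output=True, text=True).stdout
page_size = int(re.search(r"page size of (\d+) bytes", out).group(1))

stats = {}
for line in out.splitlines()[1:]:
    label, _, value = line.partition(":")
    value = value.strip().rstrip(".")
    if value.isdigit():
        stats[label.strip()] = int(value)

def gb(pages: int) -> float:
    return pages * page_size / 1e9

# App and kernel memory that cannot simply be dropped.
used = (stats.get("Anonymous pages", 0)
        + stats.get("Pages wired down", 0)
        + stats.get("Pages occupied by compressor", 0))
# Memory that shows up as "used" but is largely reclaimable (file caches, purgeable).
reclaimable = stats.get("File-backed pages", 0) + stats.get("Pages purgeable", 0)

print(f"used (approx):        {gb(used):6.1f} GB")
print(f"reclaimable (approx): {gb(reclaimable):6.1f} GB")
print(f"free:                 {gb(stats.get('Pages free', 0)):6.1f} GB")
```

The "reclaimable" bucket is roughly what makes a freshly booted 16 GB machine look busier than an 8 GB one; the memory pressure graph already accounts for it, which is why Apple points people there instead.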
The adjectives in the linked article are nauseating. Apple's marketing team fail as decent humans writing such drivel.
Give us data, tell us what's new, and skip the nonsense buzz-filling adjectives.
To quote Russell Brand, just say he sat down, not that he placed his luscious ass in silk-covered trousers on a velvety smooth chair, experiencing pleasure as the strained thigh muscles received respite after a gruelling day on their feet, watching a lush sunset in a cool summer's evening breeze.
I find it amusing how you answer your own "question" before asking it. Why would they target the marketing material at people who already know they aren't going to need to upgrade?
Roopepal, did someone piss in your coffee this morning?
I had no questions. I’m merely saying that it’s funny they’re comparing to old technology instead of last year’s.
It’s a valid criticism.
Take a breath.
I really respect Apple's privacy focused engineering. They didn't roll out _any_ AI features until they were capable of running them locally, and before doing any cloud-based AI they designed and rolled out Private Cloud Compute.
You can argue about whether it's actually bulletproof or not but the fact is, nobody else is even trying, and have lost sight of all privacy-focused features in their rush to ship anything and everything on my device to OpenAI or Gemini.
I am thrilled to shell out thousands and thousands of dollars to purchase a machine that feels like it really belongs to me, from a company that respects my data and has aligned incentives.
Mac OS calls home every time you execute an application. Apple is well on its way to ensure you can only run things they allow via app store, they would probably already be there if it wasn't for the pesky EU. If you send your computer/phone to Apple for repair you may get back different physical hardware. Those things very much highlight that "your" Apple hardware is not yours and that privacy on Apple hardware does not actually exist, sure they may not share that data with other parties but they definitely do not respect your privacy or act like you own the hardware you purchased. Apple marketing seems to have reached the level indoctrination where everyone just keeps parroting what Apple says as an absolute truth.
They send a hash of the binaries/libraries, and generate a cache locally so it's not sent again. That helps stop you from running tampered-with binaries and frameworks. No user-personal data is sent.
There is no evidence at all that they are trying to ensure you can only run things from the App Store - I run a whole bunch of non-app-store binaries every single day. To make that claim is baseless and makes me de-rate the rest of what you write.
There is always a trade-off between privacy and security. This still falls well under the Google/Android/Chrome level, or indeed the Microsoft/Windows level with its targeted ads, IMHO.
Choose your poison, but this works for me.
> They send a hash
My understanding is that they keep a local file with known malware signatures, just like the malware scanners on every other platform.
> macOS includes built-in antivirus technology called XProtect for the signature-based detection and removal of malware. The system uses YARA signatures, a tool used to conduct signature-based detection of malware, which Apple updates regularly
https://support.apple.com/guide/security/protecting-against-...
Xprotect is a blacklist that runs locally and is rarely used.
The phone home functionality is notarization, where apple does a network call to check that the signature on an executable actually came from apple’s notarization process. It is in essence a reputation system, where developers must be on good terms with apple to have the ability to notarize and get a smooth install experience.
I agree and want to emphasize a few things:
1. Most users are not capable of using general purpose computing technology in a wild, networked environment safely.
2. Too many people who matter to ignore insist, "something must be done."
3. And so something shall be done.
4. Apple is navigating difficult waters. As much as I disapprove of how they have chosen a path for iOS, the fact is many people find those choices are high value.
5. I do, for the most part, approve of their choices for Mac OS. I am not sure how they prevent malicious code without maintaining some sort of information for that purpose.
6. We are arriving at a crossroads many of us have been talking about for a long time. And that means we will have to make some hard choices going forward. And how we all navigate this will impact others in the future for a long time.
Look at Microsoft! They are collecting everything! And they absolutely will work with law enforcement anytime, any day, almost any way!
I sure as hell want nothing to do with Windows 11. Most technical people I know feel the same way.
Screenshots every 3 to 5 seconds? Are they high? Good grief! It feels like a flagrant violation of privacy.
Then we have Linux. Boy am I glad I took the time way back in the 90's to learn about OSS, Stallman, read words from interesting people, Raymond, Perkins, Searles, Lessig, Doctorow, many others!
Linus did all of tech one hell of a solid and here we are able to literally dumpster dive and build whatever we want just because we can. Awesome sauce in a jar right there
, but!
(And this really matters)
...Linux just is not going to be the general answer for ordinary people. At least not yet. Maybe it will be soon.
It is an answer in the form of a crude check and balance against those in power. Remember the "something shall be done" people? Yeah, those guys.
And here we are back to Apple.
Now, given the context I put here, Apple has ended up really important. Working professionals stand something of a chance choosing Mac OS rather than be forced into Windows 11, transparent edition!
And Apple does not appear willing to work against their users' best interests, unless they are both compelled to by law and have lost important challenges to said law.
If you want that, your choices are Apple and Linux!
7. Open, general-purpose computing is under threat. Just watch what happens with Arm PC devices and the locked bootloaders that will follow, just like on mobile devices.
Strangely, I find myself wanting to build a really nice Intel PC while I still can do that and actually own it and stand some basic chance of knowing most of what it is doing for me. Or TO ME.
No Joke!
As I move off Win 10, it will be onto Linux and Mac OS. Yeah, hardware costs a bit more, and yeah it needs to be further reverse engineered for Linux to run on it too, but Apple does not appear to get in the way of all that. They also do not need to help and generally don't. Otherwise, the Linux work is getting done by great people we all really should recognize and be thankful for.
That dynamic is OK with me too. It is a sort of harsh mutual respect. Apple gets to be Apple and we all get to be who we are and do what we all do with general purpose computers as originally envisioned long ago.
We all can live pretty easily with that.
So, onward we go! This interesting time will prove to be more dangerous than it needs to be.
If it were not for Apple carving out a clear alternative things would look considerably more draconian, I could and maybe almost should say fascist and to me completely unacceptable.
As someone who cut his teeth on computing in the era you refer to, I have a small disagreement about Linux (especially Ubuntu) in your statement.
Apple is priced beyond the reach of many "ordinary people", especially outside the western markets. A cheap (perhaps after-market) laptop with Ubuntu on it (often installed by the seller) is something that has been getting a lot of traction among regular users. Most of the things they do are via a browser, so as long as Chrome/FF works, they're good. They often install software that undermines the security the platform natively offers, but still, it's a pretty decent compromise.
Is it this part?
>Linux just is not going to be the general answer for ordinary people.
It so, I hear you. A decade or more ago, I had Ubuntu running as a general use machine for family and friends use.
It seemed almost there back then, and I saw some success.
Today it would be better, yes? I think so.
Fact is, it often takes someone doing support to make it work well, and when that person is gone, the software slips behind, leaving users to find help on their own.
Today, the numbers are much better. That happens less, but still does happen.
Your point on browser apps is solid. I agree, but those come with their own problems.
I see the most success when I set one up, including Void Tools, many visits to FossHUB...
When done, no network needed and one has a GREAT machine, ready for many tasks!
Both ways have merit and the more the merrier!
I agree with you about Apple hardware, BTW.
Fact is, large numbers of people will just end up on Windows 11 :(
Thank you, this crystallized a lot for me.
It is nice when that happens. Of course, you are welcome.
If you don't mind sharing your take, what firmed up, I would read it with great interest!
> I run a whole bunch of non-app-store binaries every single day
If you are in the US, you need to either register as a developer, or register an Apple ID and register your app in order to run it for a week. That's how you run non-App Store code. Both of those require permission from Apple.
EDIT: Sorry, I meant iOS.
This is completely incorrect. You can download a random binary and execute it. You will get a warning dialog saying it’s not signed by a known developer. You are free to ignore that though.
I'm sorry, I was thinking of the phone in my previous comment. Yes, you can run binaries on macOS with some fiddling (but my comment does apply to iOS).
Not ‘with fiddling’ — you can run any software you want on MacOS without altering or adjusting anything.
Depends what you mean by fiddling. But I'm in the process of switching to mac from Linux because my new job has forced it upon me.
I tried installing "Flameshot" via homebrew and it wouldn't run until I went into Finder, right clicked it and clicked open. Luckily it's mentioned in their docs [0] or I would have never guessed to do this.
[0] https://flameshot.org/docs/installation/installation-osx/
That is not the same thing
If I were you, I would relax. At least you are not being shoved onto Win 11.
And then think about that. Seriously. I did. Have a few times off and on over the years as we sink into this mess.
I bet you find an OS that does a bit more than you may otherwise prefer to prevent trouble. If so, fair call in my book.
Just how big of a deal is that?
Compared to Android, Windows 10 and tons of network services and such and what they do not do FOR you, and instead do TO you.
And you can run a respectable and useful installation of Linux on that spiffy Apple hardware when it gets old. So make sure it gets old, know what I mean?
It could all be way worse.
> At least you are not being shoved onto Win 11.
As someone that just got out of a gig where I had to run Docker on MacOS - for the love of god, I would have done almost anything to use Windows 11.
Look - if I'm going to be treated like garbage, advertised to and patronized, at least let me use the system that can run Linux shells without turning into a nuclear reactor.
Lol, nothing is ever easy, is it?
If I did not love computing, I would have bagged on all this long ago.
Nope. A user can just run them if they want to. It is not a big deal.
It’s not “a big deal” if the user knows about it, but the phrasing in macOS is maliciously bad. I sent a build from my machine to a coworker, and when they “naively” ran it, the pop-up that came up didn’t say “this program is unsigned”; it said “this program is damaged and will now be deleted” (I don’t remember the exact phrasing, but it made it sound like a virus or a damaged download, not an unsigned program).
I don't know about that. Or at least, I won't say they are bad.
There are sets of deep roots in play here.
Phrasing struggles are rooted in the differences in these systems, and unless we have spent time in each, struggle seems likely.
That said, I spent time on the Apple side of the computing house early on... I know it helps.
> Apple is well on its way to ensure you can only run things they allow via app store, they would probably already be there if it wasn't for the pesky EU.
People have been saying this ever since Apple added the App Store to the Mac in 2010. It’s been 14 years. I wonder how much time has to go by for people to believe it’s not on Apple’s todo list.
> If you send your computer/phone to Apple for repair you may get back different physical hardware.
I happen to be in the midst of a repair with Apple right now. And for me, the idea that they might replace my aging phone with a newer unit, is a big plus. As I think it would be for almost everyone. Aside from the occasional sticker, I don't have any custom hardware mods to my phone or laptop, and nor do 99.99% of people.
Can Apple please every single tech nerd 100% of the time? No. Those people should stick to Linux, so that they can have a terrible usability experience ALL the time, but feel more "in control," or something.
Why not both? Why can’t we have a good usability experience AND control? In fact, we used to have that via the Mac hardware and software of the 1990s and 2000s, as well as NeXT’s software and hardware.
There was a time when Apple’s hardware was user-serviceable; I fondly remember my 2006 MacBook, with easily-upgradable RAM and storage. I also remember a time when Mac OS X didn’t have notarization and when the App Store didn’t exist. I would gladly use a patched version of Snow Leopard or even Tiger running on my Framework 13 if this were an option and if a modern web browser were available.
NeXT was great and Mac OS X was also nice and had a lovely indie and boutique app ecosystem during the mid-to-late 2000s. Sadly, iOS stole the focus. However, the OP argues Linux usability is bad, which I think is an outdated POV. It really depends on your setup and usecases. For many development usecases, Linux is superior to macOS.
I run NixOS on a plain X11 environment with a browser, an editor and a terminal. It's really boring. For my favorite development stacks, everything works. Flakes make workflow easy to reproduce, and it's also easy to make dramatic setup changes at OS level thanks to declarativeness and immutability.
If you're interacting with other humans, or with the consumer internet, you'll run into thousands of situations where my default setup (macOS, Chrome) "just works," and your setup will require some extra effort.
You may be smart enough to figure it out, but most people (even many smart tech people) get tired of these constant battles.
Here's an example from earlier this evening: I was buying a plane ticket from Japan Air Lines. Chrome automagically translates their website from Japanese to English. Other browsers, e.g. Firefox, and even Safari, do not - I checked. Is there a workaround or a fix? I'm sure you could find one, given time and effort. But who wants to constantly deal with these hassles?
Another very common example is communication apps. Or any time you're exchanging data in some proprietary format. Would it be great if no one used proprietary formats? Yes! Is that the world we live in? No. Can I force the rest of the world to adopt open standards, by refusing to communicate with them? No.
The world has moved on from desktop environments to multi-device integration (Watch, Phone, AirTags, speakers, TV), and in that respect Linux usability is certainly worse than macOS.
Oh sort of. That is for sure a thing, but not THE thing.
I would argue people are being tugged in that direction more than it being simply better.
You can bet that when people get to work building things (all sorts of things, not just software), they find out pretty quickly just how important a simple desktop running on a general-purpose computer really is!
It could help to compare with other makers for a minute: if you need to repair your Surface Pro, you can easily remove the SSD from its tray, send in your machine, and stick it back in when the machine comes back repaired (new or not).
And most laptops at this point have removable/exchangeable storage. Except for Apple.
> remove the SSD from the tray, send your machine and stick it back when it comes repaired
Apple has full-disk encryption backed by the Secure Enclave, so it's not bypassable.
Sure, their standard question set asks you for your password when you submit the machine for repair.
But you don't have to give it to them. They will happily repair your machine without it, because they can boot their hardware-test suite off an external device.
I get your point, but we can also agree that "send us your data, we can't access it anyway, right?" is a completely different proposition from physically removing the data.
In particular, if a flaw were revealed in the Secure Enclave or the encryption, it would be too late to act on it for machines that have already been sent in over the years.
To be clear, I'm reacting to the "Apple is privacy focused" part. I wouldn't care if they snooped on my bank statements on disk, but as a system, I see them as behind what other players in the market are doing.
> if a flaw was ...
I hear the point you're making and I respect the angle, it's fair enough, but ...
The trouble with venturing into what-if territory is the same applies to you...
What if the disk you took out was subjected to an evil-maid attack?
What if the crypto implementation used on the disk you took out was poor?
What if someone had infiltrated your OS already and been quietly exfiltrating your data over the years ?
The trouble with IT security is that you have to trust someone and something. Even with open source, you're never going to sit and read the code (of the program AND its dependency tree), and even with open hardware you still need to trust all those parts you bought that were made in China, unless you're planning to open your own chip fab and motherboard plant.
It's the same with Let's Encrypt certs; every man and his dog is happy to use them these days. But there's still a lot of underlying trust going on there, no?
So, all things considered, if you did a risk assessment, would trusting Apple come out as reasonable? Most people would say yes.
> even with open-source, you're never going to sit and read the code (of the program AND its dependency tree)
You don't have to. The fact that it's possible for you to do so, and the fact that there are many other people in the open-source community able to do so and share their findings, already makes it much more trustworthy than any closed Apple product.
THIS!
Back when I was new to all of this, the idea of people evaluating their computing environment seemed crazy!
Who does that?
Almost nobody by percentage, but making sure any of us CAN is where the real value is.
Jia Tan has entered the chat.
> What if the disk you took out was subjected to an evil-maid attack ?
Well, have fun with my encrypted data. Then I get my laptop back, and it's either a) running the unmodified, signed and encrypted system I set before or b) obviously tampered with to a comical degree.
> What if the crypto implementation used on the disk you took out was poor ?
I feel like that is 100x more likely to be a concern when you can't control disc cryptography in any meaningful way. The same question applies to literally all encryption schemes ever made, and if feds blow a zero day to crack my laptop that's a victory through attrition in anyone's book.
> What if someone had infiltrated your OS already and been quietly exfiltrating your data over the years ?
What if aliens did it?
Openness is a response to a desire for accountability, not perfect security (because that's foolish to expect from anyone, Apple or otherwise). People promote Linux and BSD-like models not because they cherry-pick every exploit the way Microsoft and Apple do, but because anyone planting a deliberate backdoor must accept that they are submitting it to a hostile environment. Small patches will be scrutinized line by line; large patches will be delayed until they are tested and verified by maintainers. Maybe my trust in the maintainers is misplaced, but no serious exploit developer is foolish enough to assume they'll never be found. They are publishing themselves to the world, irrevocably.
What if the disk could be removed, put inside a thunderbolt enclosure, and worked on another machine while waiting for the other? That's what I did with my Framework.
Framework has demonstrated in more than one way that Apple's soldered/glued-in hardware strategy is not necessary.
> Apple has full-disk encryption backed by the secure enclave so its not by-passable.
Any claims about security of apple hardware or software are meaningless. If you actually need a secure device, apple is not an option.
> Any claims about security of apple hardware or software are meaningless. If you actually need a secure device, apple is not an option.
I don't think this is precise, but the constraints seem a bit vague to me. What do you consider to be in the list of secure devices?
Using fully open hardware and software I guess ?
I'm not even here to troll, if you can give details on the list and why that'd be awesome
Seconded
So why the hell do they ask for it, then?
> So why the hell do they ask for it, then?
I suppose so they can do a boot test post-repair or something like that. I have only used their repair process like twice in my life and both times I've just automatically said "no" and didn't bother asking the question. :)
With Apple FDE, you get nowhere without the password. The boot process doesn't pass go. Which catches people out when they reboot a headless Mac: the password comes before boot, not after, even if the GUI experience makes it feel otherwise.
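For anyone who wants to confirm that before handing a machine over, a quick sketch (fdesetup is the relevant built-in; wrapping it in Python here is purely for illustration):

    import subprocess

    # Prints "FileVault is On." or "FileVault is Off." on current macOS releases
    subprocess.run(["fdesetup", "status"], check=True)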
The counterpoint is wiping the device and restoring from local backups when it is returned.
You need to trust the erasure system, which is software. It also requires you to have write access to the disk, whatever the issue is; otherwise your trust rests on the encryption and on nobody having the key.
That's good enough for most consumers, but it's a lot more sensitive for enterprises IMHO. It usually gets a pass by having the contractual relationship with the repair shop cover the risks, but I know some roles that don't get MacBooks for that reason alone.
>And for me, the idea that they might replace my aging phone with a newer unit, is a big plus. As I think it would be for almost everyone.
Except that isn't generally how factory repairs are handled.
I don't know about Apple specifically, but other companies (Samsung, Microsoft, Lenovo) will happily swap your unit for a factory-refurbished or warranty-repaired unit as long as it was sufficiently qualified beforehand, so the 'replaced with a newer unit' concept might be fantasy.
What makes you think it would be a new one as opposed to a refurbished used one.
If the parts show no signs of wear and tear, what is the difference? Theseus' iPhone.
I've seen a few Rossmann streams with officially "refurbished" MacBooks that were absolutely foul inside. Boards that looked like they had been left on a preheater over lunch, rubber wedges to "cure" a cracked joint, all sorts of awful shit. The leaked stories from the sweatshop that did the work were 100% consistent with the awful quality.
Admittedly this was a few years ago. Has Apple mended their ways, or are they still on the "used car salesman" grindset?
It makes me uncomfortable. No particular rational reason, I just don't like it.
Thanks. That is fair. Your truth, and I respect that.
> What makes you think it would be a new one as opposed to a refurbished used one?
Because Apple got sued for doing that once, and people including myself are in line to get checks from it.
It would depend on a country's consumer laws. I used to work for AASPs in Australia, and they definitely used refurbished phones for replacements and refurbished parts for Mac repairs. Not everyone who uses this site lives in America...
It's also the rule in the EU.
> What makes you think it would be a new one
Did I say it would be a "new one"?
Yes, unless this was edited later on.
> 'might replace my aging phone with a newer unit, '
unless you just want to argue about the semantics and differences between 'aging', 'newer', and 'new'.
You think the difference between "newer than ... aging phone" and "NEW" is "semantics"???
HN really has turned into reddit.
Reddit-style pedantry time!
Semantics is literally the meaning of things. So, yes the difference between those phrases is semantics.
But your use of 'semantics' meant something subtly different. Ain't language weird?
> And for me, the idea that they might replace my aging phone with a newer unit, is a big plus.
It's called a warranty, and it's not at all exclusive to Apple?
> Those people should stick to Linux, so that they can have a terrible usability experience ALL the time, but feel more "in control," or something.
Maybe you should stick to reading and not commenting, if this is the best you can do.
With the sheer number of devs who use Macs, there is a 0% chance they’re going to outright prevent running arbitrary executables. Warn / make difficult, sure, but prevent? No.
The strategy is to funnel most users onto an ipad-like platform at most where they have basic productivity apps like word or excel but no ability to run general purpose programs.
Meanwhile you have a minimal set of developers with the ability to run arbitrary programs, and you can go from there with surveillance on MacOS like having every executable tagged with the developer's ID.
The greater the distance between the developer and the user, the more you can charge people to use programs instead of just copying them. But you can go much further under the guise of "quality control".
> The strategy is to funnel most users onto an ipad-like platform
They make the best selling laptop in the world, and other most-popular-in-class laptops. If their strategy is to have people not use laptops, they are going about it funny.
If so, they are executing it badly.
As for every executable being tagged, that is not required. People can build binaries with open tools and other people can run them.
A hash gets created so Apple can compare it against binaries found to be nefarious somehow. Seems like a reasonable proposition.
> The strategy is to funnel most users onto an ipad-like platform at most where they have basic productivity apps like word or excel but no ability to run general purpose programs.
And you know this how?
This reads like every macOS fan’s worst nightmare, but there’s zero actual evidence that Apple is going in this direction.
Please share sources if you disagree.
> Mac OS calls home every time you execute an application
Consulting a certificate revocation list is a standard security feature, not a privacy issue.
Further, there is a CRL/OCSP cache — which means that if you're running a program frequently, Apple are not receiving a fine-grained log of your executions, just a coarse-grained log of the checks from the cache's TTL timeouts.
Also, a CRL/OCSP check isn't a gating check — i.e. it doesn't "fail safe" by disallowing execution if the check doesn't go through. (If it did, you wouldn't be able to run anything without an internet connection!) Instead, these checks can pass, fail, or error out; and erroring out is the same as passing. (Or rather, technically, erroring out falls back to the last cached verification state, even if it's expired; but if there is no previous verification state — e.g. if it's your first time running third-party app and you're doing so offline — then the fallback-to-the-fallback is allowing the app to run.)
Remember that CRLs/OCSP function as blacklists, not whitelists — they don't ask the question "is this certificate still valid?", but rather "has anyone specifically invalidated this certificate?" It is by default assumed that no, nobody has invalidated the certificate.
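A minimal sketch of that fail-open logic, purely illustrative (the real trustd behavior isn't public, and every name here is made up):

    from enum import Enum, auto

    class Check(Enum):
        PASS = auto()   # certificate verified
        FAIL = auto()   # certificate explicitly revoked
        ERROR = auto()  # service unreachable, timeout, offline...

    def allow_launch(ocsp_result: Check, cached_verdict: Check | None) -> bool:
        if ocsp_result is Check.PASS:
            return True
        if ocsp_result is Check.FAIL:
            return False
        # ERROR: fall back to the last cached verdict, even if expired;
        # with no cached verdict at all (first run while offline), fail open.
        return cached_verdict is not Check.FAIL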
> i.e. it doesn't "fail safe" by disallowing execution if the check doesn't go through. (If it did, you wouldn't be able to run anything without an internet connection!) Instead, these checks can pass, fail, or error out; and erroring out is the same as passing. (Or rather, technically, erroring out falls back to the last cached verification state, even if it's expired; but if there is no previous verification state — e.g. if it's your first time running third-party app and you're doing so offline — then the fallback-to-the-fallback is allowing the app to run.)
https://www.sentinelone.com/blog/what-happened-to-my-mac-app...
> Last week, just after we covered the release of Big Sur, many macOS users around the world experienced something unprecedented on the platform: a widespread outage of an obscure Apple service caused users worldwide to be unable to launch 3rd party applications.
Scroll down a little further on your link for confirmation of what the parent said:
> As was well-documented over the weekend, trustd employs a “fail-soft” call to Apple’s OCSP service: If the service is unavailable or the device itself is offline, trustd (to put it simply) goes ahead and “trusts” the app.
Even at the time people quickly figured out you could just disconnect from the internet as a workaround until the issue was fixed.
This reply is very informative and should be much more visible given the extent of general ignorance about the "zomg it phones home" feature.
There was a since-fixed bug in a prior macOS release that did fail to launch apps on the local machine if the CRL data was unreachable.
Why is it that non app store apps refuse to run until I explicitly allow it in settings then?
Both Windows and macOS require that developers digitally sign their software if they want users to be able to run that software without jumping through additional hoops on their computers.
You can't distribute software through the Apple or Microsoft app stores without the software being signed.
You can sign and distribute software yourself without having anything to do with the app stores of either platform, although getting a signing certificate that Windows will accept is more expensive for the little guys than getting a signing certificate that Macs will accept.
On Windows, allowing users to run your software without jumping through additional hoops requires you to purchase an Extended Validation Code Signing Certificate from a third party. Prices vary, but it's going to be at least several hundred dollars a year.
https://www.reddit.com/r/electronjs/comments/17sizjf/a_guide...
Apple includes a software signing certificate with a basic developer account, which runs $100 a year.
You can ignore that on either platform, but users will have to take additional actions before they can run your unsigned software.
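As a practical aside, on macOS you can check ahead of time how Gatekeeper will treat a build. A rough sketch (the app path is a placeholder, and spctl's output wording varies between macOS versions):

    import subprocess

    APP = "/Applications/SomeApp.app"  # placeholder path

    # Verify the code signature itself
    subprocess.run(["codesign", "--verify", "--deep", "--strict", "--verbose=2", APP])

    # Ask Gatekeeper whether it would accept the app for execution
    subprocess.run(["spctl", "--assess", "--type", "execute", "--verbose", APP])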
I have literally never experienced that and I use homebrew apps a lot
Perhaps you turned some "make things ultra-secure" setting on at some point ?
That's just your extremely limited experience (2 stores): Homebrew clears the quarantine bit for you so you don't get that notification, which you do get if you download apps directly.
I suspect they're referring to changes to Gatekeeper in recent macOS versions: https://arstechnica.com/gadgets/2024/08/macos-15-sequoia-mak...
It used to be that you could run any third-party application you downloaded. Then for a while you'd have to right-click and select Open the first time you ran an application you'd downloaded, and then click through a confirmation prompt. And in macOS 15, you have to attempt to open the application, be told it is unsafe, and then manually approve it via System Settings.
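Those prompts are driven by the com.apple.quarantine extended attribute that browsers attach to downloads; a small sketch for inspecting it (the file path is a placeholder):

    import subprocess

    DOWNLOADED = "/Users/me/Downloads/SomeTool.app"  # placeholder path

    # Show the quarantine attribute, if present (non-zero exit when it's absent)
    subprocess.run(["xattr", "-p", "com.apple.quarantine", DOWNLOADED])

    # Deleting the attribute suppresses the first-run prompt entirely;
    # only do this for files you trust.
    # subprocess.run(["xattr", "-d", "com.apple.quarantine", DOWNLOADED])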
Huh? It hashes the binary and phones home, doesn’t it? Go compile anything with gcc and watch it take one extra second for the first run of that executable. It’s not verifying any certificates.
When I first run locally-built software I tend to notice XProtect scanning each binary when it is launched. I know that XProtect matches the executable against a pre-downloaded list of malware signatures rather than sending data to the internet, but I haven't monitored network traffic to be sure it is purely local. You can see the malware signatures it uses at /private/var/protected/xprotect/XProtect.bundle/Contents/Resources/XProtect.yara if you're curious.
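If you're curious what that local list looks like, a small sketch that just counts the rule names in the file mentioned above (reading it may require Full Disk Access on recent macOS; yara-python could parse it properly, but a regex is enough for a peek):

    import re
    from pathlib import Path

    XPROTECT = Path("/private/var/protected/xprotect/XProtect.bundle"
                    "/Contents/Resources/XProtect.yara")

    text = XPROTECT.read_text(errors="ignore")
    rules = re.findall(r"^\s*rule\s+(\w+)", text, flags=re.M)
    print(f"{len(rules)} signature rules, e.g. {rules[:5]}")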
> phones home
Nope.
It has a built in malware scanner, but that just requires a downloaded list of known malware signatures.
> not share that data with other parties but they definitely do not respect your privacy
not sharing my data with other parties, or using it to sell me stuff or show me ads, is what I would define as respecting my privacy; Apple checks those boxes where few other tech companies do
Agree. I recently went to an Apple Store in Tokyo to buy an accessory. The Apple employee pulled up their store iPhone to take my payment (Apple Pay) and then asked me to fill out a form with my email address, and there was a message about how my info would be shared with some company. I thought about going back and pretending to buy something else so I could film it. I questioned the store person: isn't Apple supposed to be "privacy first"? If it were privacy first, they wouldn't have asked for the info in the first place, and they certainly wouldn't be sharing it with a third party.
Their repair policy, from what I can see, is a thinly veiled attempt to get you to either pay for Apple Care or to upgrade. I got a quote to repair a colleague's MacBook Pro, less than 2 years old, which has apparent 'water damage' and which they want AUD $2,500 to repair! Of course that makes no sense, so we're buying a new one ...
> to get you to either pay for Apple Care
The problem with many self-repair people is they effectively value their time at zero.
I value my time realistically, i.e. above zero and above minimum wage. It is therefore a no-brainer for me to buy AppleCare every ... single ... time. It means I can just drop the machine off and let someone else deal with messing around.
I also know how much hassle it is. Like many techies, I spent part of my early career repairing people's PCs. Even in big PC tower cases with easy access to all the parts, it's still a fucking horrific waste of time. Hence these days I'm very happy to let some junior at Apple do it for the cost of an AppleCare contract.
> The problem with many self-repair people is they effectively value their time at zero.
Back in 2010 Apple quoted me €700 for a topcase replacement because of shattered display glass. Instead I paid €50 for a third party replacement pane and did 15 minutes of work with a heat gun.
What's more, they fold most of the cost of the repair into the price of parts. So you can either get a replacement screen for €499 and install it yourself, or have it officially repaired for €559. This effectively subsidizes official repairs and makes DIY repairs more expensive.
Apple does extreme gouging with repairs; it's hogwash to claim anything else.
A big problem with AppleCare, here in Thailand anyway, is that you need to give them your computer for a few weeks. You have to wait a week for them to even look at it. They won't let you keep using it and bring it back in a week.
How often do you actually need a repair from Apple? I used to buy AppleCare but stopped in the last few years and have yet to need any repairs done except a battery replacement on a 14 Pro that I was giving to family.
There are three kinds of people
1. people who arguably fall under the definition of careless, or have small children, need repair plans
2. people who are fastidious and nothing ever breaks, don't need repair plans
3. people who are fastidious, have small children, need repair plans
I was a #2 and I'm slowly transitioning into a #3 for specific purchases.
Insurance is always a gamble. Up to you to do your own math on the risks..
That is not why I didn’t buy Apple Care.
My hope is that the machine will work for a long while, like most of them do. In my case it’s a ~$1200 machine so I prefer to self-insure. I’m taking the chance that if it goes bad, I’ll pay to fix or replace it.
This makes sense for me because I do it with everything that I buy.
Why not pay for AppleCare? In the US it covers water damage.
Because it feels like extortion. There was almost certainly no water damage caused by external factors: the user didn't spill anything on it and has literally no idea where the so-called water damage could have come from. I have heard anecdotally that this is their go-to for denying claims and it is difficult to argue against.
Humidity in the air can eventually trigger whatever they use to report wet damage.
Yes, that's what I've heard, which seems crazy.
> Apple is well on its way to ensure you can only run things they allow via app store, they would probably already be there if it wasn't for the pesky EU
What has the EU done to stop Apple doing this? Are Apple currently rolling it out to everywhere but the EU?
At the very least Apple are better than Microsoft, Windows and the vendors that sell Windows laptops when it comes to respecting user experience and privacy.
I switched to iPhone after they added the tracker blocking to the OS.
Everything is a tradeoff.
I’d love to live in the F-Droid alt-tech land, but everything really comes down to utility. Messaging my friends is more important than using the right IM protocol.
Much as I wish I could convince everyone I know and have yet to meet to message me on Signal or whatever, that simply isn’t possible. Try explaining that I am not on Whatsapp or insta to a girl I’ve just met…
Also it is nice to spend basically no time maintaining the device, and have everything work together coherently. Time is ever more valuable past a certain point.
But why do we have to choose between convenient and open? Why are these companies allowed to continue having these protected "gardens"? I don't believe a free and truly open ecosystem for mobile devices would actually be less convenient than iOS or Android. If anything it would be vastly better.
They have big numbers. Big numbers tell them that 95% of people need to be in closed, protected gardens rather than getting slaughtered by open-source wolves.
Has it occurred to you that the stronger control of the ecosystem is a feature that supports the convenience and integration that's possible?
This is just the "Why not Linux desktop" argument from the past two decades. Sure, in theory it can be configured to do a lot of different things. But you're probably gonna have to work out the details yourself because the downside of theoretically supporting everything is that it's impossible to just have it work out of the box with every single scenario.
That's a low bar for girls IMO (not being able to grasp that someone might not want to use Whatsapp or Instagram).
> Apple is well on its way to ensure you can only run things they allow via app store
I don't think Apple's behavior actually reflects this if you look closely (although I can certainly see how someone could form that opinion):
As a counter example, Apple assisted with their own engineers to help port Blender to Metal (https://code.blender.org/2023/01/introducing-the-blender-met...):
> Around one year ago, after joining the Blender Development Fund and seeding hardware to Blender developers, Apple empowered a few of its developers to directly contribute to the Blender source code.
I'm assuming similar support goes to other key pieces of software, e.g., from Adobe, Maxon, etc... but they don't talk about it for obvious reasons.
The point being that Apple considers these applications key to their ecosystem, and (in my estimation at least) these are applications that will probably never be included in the App Store. (The counterargument would be the Office suite, which is in the App Store, but the key Office application, Excel, is a totally different beast from the flagship Windows version; that kind of split isn't possible with the Adobe suite, for example.)
Now what I actually think is happening is the following:
1. Apple believes the architecture around security and process management that they developed for iOS is fundamentally superior to the architecture of the Mac. This is debatable, but personally I think it's true as well for every reason, except for what I'll go into in #2 below. E.g., a device like the Vision Pro would be impossible with macOS architecture (too much absolute total complete utter trash is allowed to run unfettered on a Mac for a size-constrained device like that to ever be practical, e.g., all that trash consumes too much battery).
2. The open computing model has been instrumental in driving computing forward. E.g., going back to the Adobe example, After Effects plugins are just dynamically linked right into the After Effects executable. Third party plugins for other categories often work similarly, e.g., check out this absolutely wild video on how you install X-Particles on Cinema 4D (https://insydium.ltd/support-home/manuals/x-particles-video-...).
I'm not sure if anyone on the planet even knows why, deep down, #2 is important; I've never seen anyone write about it. But all the boundary-pushing computing fields I'm interested in, which are mainly around media creation (i.e., historically Apple's bread and butter), seem to depend on it (notably, they are all also local-first, i.e., they can't really be handled by a cloud service that opens up other architecture options).
So the way I view it is that Apple would love to move macOS to the fundamentally superior architecture model from iOS, but it's just impossible to do so without hindering too many use cases that depend on that open architecture. Apple is willing to go as close to that line as they can (in making the uses cases more difficult, e.g., the X-Particles video above), but not actually willing to cross it.
> Apple is well on its way to ensure you can only run things they allow via app store
I am totally OK with this. I have personally seen Apple reject an app update and delist the app because a tiny library used within it had a recent security concern. That forced the company to fix it.
No one is stopping you from using only the App Store if you value its protection, so you need a more relevant justification for forcing everyone else to do so.
What about all those libs and executables you likely install via brew, npm, cargo etc? Those are all applications
Sure – Apple are trying to stop people who don't know what they're doing from getting hurt. Hence the strong scrutiny on what is allowed on the App Store (whether it's reasonable to charge 30% of revenue is an entirely different question).
People who are installing things using a terminal are probably (a) slightly computer savvy and (b) therefore aware that this might not be a totally safe operation.
And, despite being an avid homebrew user, I've never had a problem there.
All of us having this discussion are outliers.
The things we talk about here which annoy us are for the much larger set of people who need them!
Put another way, it is all about the set of us who cannot really participate in this discussion.
Even if I have analytics disabled?
Genuinely asking: are there any specifics on this? I understand that blocking at the firewall level is an option, but I recall someone here mentioning an issue where certain local machine rules don’t work effectively. I believe this is the issue [1]. Has it been “fixed”?
[1] https://appleinsider.com/articles/21/01/14/apple-drops-exclu...
They're probably referring to the certificate verification that happens when you open any notarized application. Unless something changed recently, the system phones home to ensure its certificate wasn't revoked.
It doesn't do that on every app launch; there's a cache. It does it on the first launch of a binary from a new team.
(So multiple binaries with the same team don't check either.)
And I'd expect all logging is disabled on the CDN.
I have no reason to expect that it is.
> Even if I have analytics disabled?
Yeah, because what’s being sent is not analytics but related to notarization: verifying the app’s integrity (aka, is it signed by a certificate known to Apple?).
This came to light a few years ago when the server went down and launching apps became impossibly slow…
https://www.macrumors.com/2020/11/12/mac-apps-not-opening/
> where everyone just keeps parroting what Apple says as an absolute truth.
You are free to verify.
> Apple is well on its way to ensure you can only run things they allow via app store,
What are you talking about? I don’t run a single app from the app store and have never felt a need to.
I mean, the security features are pretty well documented. The FBI can't crack a modern iPhone even with Apple's help. A lot of the lockdowns are in service of that.
I'm curious: what hardware and software stack do you use?
Cellebrite Premium 7.69.5 iOS Support Matrix from July 2024.
https://discuss.grapheneos.org/d/14344-cellebrite-premium-ju...
Doesn't AFU here mean the phone had to be already unlocked? Which is most of the entries?
AFU means the phone was unlocked and then relocked.
Right, so not the use case involving the police up thread.
Police do often want to get into phones in that state. This is why Cellebrite sells that product.
The FBI and Apple "can't", but third parties do, and they do it cheaper every day.
They do not.
Edit: I have not posted a source for this claim, because what sort of source would be acceptable for a claim of the form "X has not occurred"?
If you are going to claim Apple's security model has been compromised, you need not only evidence of such a compromise but also an explanation for why such an "obvious" and "cheap" vulnerability has not been disclosed by any number of white or grey-hat hackers.
Jesus, just post a source.
the burden on proof is not on him to prove a negative
Yes they do.
If you're going to claim that random hacking groups routinely do something the FBI and NSA claim to be unable to do... citation needed.
Ok [1]
"Since then, technologies like Grayshift’s GrayKey—a device capable of breaking into modern iPhones—have become staples in forensic investigations across federal, state, and local levels."
"In other cases where the FBI demanded access to data stored in a locked phone, like the San Bernardino and Pensacola shootings, the FBI unlocked devices without Apple’s help, often by purchasing hacking tools from foreign entities like Cellebrite."
1 - https://www.firstpost.com/tech/the-fbi-was-able-to-hack-into...
An issue with taking their claim at face value is they have no incentive to say they can:
- they can keep asking for backdoors to "stop terrorists"
- they're not on the hook if, for whatever reason, they can't access a particular phone in a highly publicized case
- most targets (the not so sophisticated ones at least) keep using a device the agencies have proper access to
Regardless of their actual technical means, I don't expect we ever get a "we sure can!" kind of public boasting any time soon.
Is there evidence of this. I’d be interested to know more.
> Apple is well on its way to ensure you can only run things they allow via app store
I'm very happy to only run stuff approved on Apple's app store... ESPECIALLY following their introduction of privacy labels for all apps so you know what shit the developer will try to collect from you without wasting your time downloading it.
Also have you seen the amount of dodgy shit on the more open app stores ?
It's a reasonable choice to do so and you can do it now. The problem starts when Apple forbid it for people who want to install on their computer what they want.
> to purchase a machine that feels like it really belongs to me
How true is this when their devices are increasingly hostile to user repair and upgrades? macOS also tightens the screws on what you can run and from where, or at least requires more hoop-jumping over time.
Of course I wish the hardware were somehow more open, but to a large extent the lockdown exists directly because of hardware-based privacy features.
If you allowed third-party components without restraint, there'd be no way to prevent someone swapping out a component.
Lock-in and planned obsolescence are also factors, and ones I'm glad the EU (and others) are pushing back here. But it isn't as if there are no legitimate tradeoffs.
Regarding screw tightening... if they ever completely remove the ability to run untrusted code, yes, then I'll admit I was wrong. But I am more than happy to have devices be locked down by default. My life has gotten much easier since I got my elderly parents and non-technical siblings to move completely to the Apple ecosystem. That's the tradeoff here.
One of the most underrated macOS features is the screen sharing app - it’s great for seamless tech support with parents.
It works via your keychain and your contacts, and the recipient gets a little notification to allow you to view their screen.
That’s it - no downloads, no login, no 20 minutes getting a Remote Desktop screen share set up.
As far as I can see, it's not possible to connect to a device that uses the same Apple account, which is the setup I have in my case. It has to be a different one.
Also, it only seems to work on a local network with hostnames.
Ahh - not sure about the same account, I’ve not tried that.
It 100% works across the internet: it works with contact names, not just host names.
> to a large extent, it's directly because of hardware based privacy features.
First, this is 100% false. Second, security through obscurity is almost universally discouraged and considered bad practice.
> Second, security through obscurity is almost universally discouraged and considered bad practice.
This is stupid advice that is mindlessly repeated. Security by obscurity only is bad, sure. Adding obscurity to other layers of security is good.
Edit: formatting
This is a common saying but in reality, security through obscurity is widely deployed and often effective.
More pragmatic advice would be to not rely solely on security through obscurity, but rather to practice defence in depth.
Security by insecurity is also 'widely deployed and often effective'.
> I wish the hardware were somehow more open
Some of us are old enough to remember the era of the officially authorised Apple clones in the 90's.
Some of us worked in hardware repair roles at the time.
Some of us remember the sort of shit the third-party vendors used to sell as clones.
Some of us were very happy the day Apple called time on the authorised clone industry.
The tight-knit integration between Apple OS and Apple Hardware is a big part of what makes their platform so good. I'm not saying perfect. I'm just saying if you look at it honestly as someone who's used their kit alongside PCs for many decades, you can see the difference.
> My life has gotten much easier since I got my elderly parents and non-technical siblings to move completely to the Apple ecosystem. That's the tradeoff here.
Yeah, but this is hacker news.
“Hacker News” was always the arm of Valley startup mentality, not Slashdot-era Linux enthusiast privacy spook groupthink. It is unfortunate that this change has occurred.
> How true is this when their devices are increasingly hostile to user repair and upgrades?
Not sure what you mean exactly by this, but to me their Self Service Repair program is a step in the right direction.
It was mandated by right-to-repair laws, it provides the absolute minimum, and they've attempted to price out people wanting to do repairs. The only way it could be more hostile to users is by literally being illegal.
They could go out of their way to make things actually easy to work on and service, but that has never been the Apple Way. Compare to framework or building your own PC, or even repairing a laptop from another OEM.
Apple also left a very convenient hole in their boot loader to allow running another OS. Linux works pretty well these days
* on M1 and M2 variants.
Just to clarify, Asahi Linux is working on M3/M4 support. As far as I can tell nothing changed in the boot loader that makes this work more difficult, it just takes time to add the new hardware.
https://git.sudo.is/mirrors/AsahiLinux-docs/wiki/M3-Series-F...
The bootloader supports third-party operating systems on M3/M4 as well. Linux support just isn't there yet.
* As long as you don't want to use any external displays
Works with devices with HDMI ports
You can buy most parts officially from Apple - I just bought a new set of keycaps to replace some on my MacBook Air. Couldn't do that 5 years ago.
You can install whatever OS you want on your computer - Asahi Linux is the only one that's done the work to support that.
You can disable the system lockdowns you refer to that "tighten the screws" and unlock most things back to how they used to be.
Considering you need an Apple ID to log into the hardware, I'd argue Apple gatekeeps that ownership pretty tightly.
This isn't true.
edit: also, unless you are the digital equivalent of "off the grid", I would argue most people are going to need some sort of cloud-based identity anyway for messaging, file-sharing, etc. iCloud is far and away the most secure of the options available to most users, and the only one that uses full end-to-end encryption across all services.
> edit: also, unless you are the digital equivalent of "off the grid", I would argue most people are going to need some sort of cloud-based identity anyway for messaging, file-sharing, etc. iCloud is far and away the most secure of the options available to most users, and the only one that uses full end-to-end encryption across all services.
"You need some cloud-based identity, and this is the best one," even granting its premises, doesn't make being forced into this one a good thing. I'm an Apple user, but there are plenty of people I need to message and share files with who aren't in the Apple ecosystem.
EDIT: As indicated in the reply (written before I added this edit), it sounds like I was ignoring the first part of the post, which pointed out that you aren't forced to use it. I agree that that is a sensible, and even natural and inevitable, reading. I actually wasn't ignoring that part, but I figured the only reason to include this edit was to say "that isn't true, but if it were true, then it would be OK." (Otherwise, what's the point? There's no more complete refutation needed of a false point than that it is false.) My argument is that, if it were true, then that wouldn't be OK, even if you need a cloud-based identity, and even if iCloud is the best one.
> doesn't make being forced into this one a good thing
But you're not forced. You completely ignored the other response in order to continue grinding an axe.
It's optional and very easy to skip. Not like the requirement for a MS account on Windows 11, which is also skippable but not by the average user.
I had to set up a Windows computer for the first time in a decade recently, and holy shit did they make it difficult to figure out how to do it without a Microsoft account.
> MacOS also tightens the screws on what you can run and from where, or at least require more hoop jumping over time.
Can you explain what you mean by this? I have been doing software development on MacOS for the last couple of years and have found it incredibly easy to run anything I want on my computer from the terminal, whenever I want. Maybe I'm not the average user, but I use mostly open-source Unix tooling and have never had a problem with permissions or restrictions.
Are you talking about packaged applications that are made available on the App Store? If so, sure, have rules to make sure the store is high quality, kinda like how Costco doesn't let anyone just put garbage on their shelves.
> Can you explain what you mean by this? I have been doing software development on MacOS for the last couple of years and have found it incredibly easy to run anything I want on my computer from the terminal, whenever I want.
Try sharing a binary that you built but didn't sign and Notarize and you'll see the problem.
It'll run on the machine that it was built on without a problem, the problems start when you move the binary to another machine.
> How true is this when their devices are increasingly hostile to user repair and upgrades?
I can neither repair nor upgrade my electric car, my furniture, or my plumbing. But they all still belong to me.
> I am thrilled to shell out thousands and thousands of dollars to purchase a machine that feels like it really belongs to me, from a company that respects my data and has aligned incentives.
You either have very low standards or a very low understanding if you think a completely closed OS on top of completely closed hardware somehow means it 'really belongs' to you, or that your data/privacy is actually being respected.
What's the alternative? Linux? Maybe OP likes that their OS doesn't crash when they close their laptop lid.
It's not that bad anymore (e.g. with System76), but I understand the point.
I disagree with OP celebrating Apple to be the least evil of the evils. Yes, there are not many (if any) alternatives, but that doesn't make Apple great. It's just less shitty.
It feels like a lot of people in these threads form their opinions of what desktop Linux is like these days based on one poor experience from back in 2005.
If you're so focused on privacy why don't you just use Linux? With Linux you'll actually get real privacy and you'll really truly own the system.
Apple takes a 30% tax on all applications running on their mobile devices. Just let that sink in. We are so incredibly lucky that never happened to PC.
As much as anyone can say otherwise, running Linux isn’t just a breeze. You will run into issues at some point, you will possibly have to make certain sacrifices regarding software or other choices. Yes it has gotten so much better over the past few years but I want my time spent on my work, not toying with the OS.
Another big selling point of Apple is the hardware. Their hardware and software are integrated seamlessly. Things just work, and they work well 99% of the time; there are always edge cases.
There’s solutions to running Linux distros on some Apple hardware but again you have to make sacrifices.
Even on the machines most well-supported by Linux, which are Intel x86 PCs with only integrated graphics and Intel wifi/bluetooth, there are still issues that need to be tinkered away like getting hardware-accelerated video decoding working in Firefox (important for keeping heat and power consumption down on laptops).
I keep around a Linux laptop and it's improved immensely in the past several years, but the experience still has rough edges to smooth out.
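For reference, the usual Firefox tinkering boils down to a couple of about:config preferences plus the right VA-API driver for your GPU. A sketch that appends them to a profile's user.js (the profile path is a placeholder, and the preference names have shifted between Firefox releases, so verify them for your version):

    from pathlib import Path

    # Placeholder profile directory; find yours under ~/.mozilla/firefox/
    PROFILE = Path.home() / ".mozilla/firefox/xxxxxxxx.default-release"

    prefs = [
        'user_pref("media.ffmpeg.vaapi.enabled", true);',
        'user_pref("media.hardware-video-decoding.force-enabled", true);',
    ]
    with open(PROFILE / "user.js", "a") as f:
        f.write("\n".join(prefs) + "\n")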
They just have really good marketing. You fell for their pandering. If you really care about privacy use Linux. But Apple ain't it. Closed source and proprietary will never be safe from corporate greed.
>https://archive.ph/Z9z0H
> Private Cloud Compute
That's pure security theater. As long as nobody can look inside their ICs, nobody knows what's really happening there.
They've certainly engaged in a lot of privacy theater before. For example
> Apple oversells its differential privacy protections. "Apple’s privacy loss parameters exceed the levels typically considered acceptable by the differential privacy research community," says USC professor Aleksandra Korolova, a former Google research scientist who worked on Google's own implementation of differential privacy until 2014. She says the dialing down of Apple's privacy protections in iOS in particular represents an "immense increase in risk" compared to the uses most researchers in the field would recommend.
https://www.wired.com/story/apple-differential-privacy-short...
Does that mean you just don't bother encrypting any of your data, and just use unencrypted protocols? Since you can't inspect the ICs that are doing the work, encryption must all also be security theater.
That could be said of any device you own, ever.
That's a fine bit of goalpost shifting. They state that they will make their _entire software stack_ for Private Cloud Compute public for research purposes.
Assuming they go through with that, this alone puts them leagues ahead of any other cloud service.
It also means that to mine your data the way everyone else does, they would need to deliberately insert _hardware_ backdoors into their own systems, which seems a bit too difficult to keep secret and a bit too damning a scandal should it be discovered...
Occam's razor here is that they're genuinely trying to use real security as a competitive differentiator.
The first release set should be downloadable now for inspection. (It's binaries only, source is released for select components)
Actually Apple has stated they are allowing security researchers to look at their infrastructure DIRECTLY.
They haven't done this.
That doesn't mean they get to know what happens inside the ICs.
Looking at a bunch of PCBs doesn't tell you much.
> I really respect Apple's privacy focused engineering.
Every time you launch an app, Mac OS dials home.
The approach the big platforms take of producing their own versions of very successful apps cannibalizes their partners. This focus on consumer privacy by Apple is the company's killer competitive advantage in this particular area, IMO. If I felt they were mining my private business data, I'd switch to Linux in a heartbeat. This is what keeps me off Adobe, Microsoft Office, Google's app suite, and apps like Notion as much as possible.
Of late I have been imagining tears of joy rolling down the face of the person who decides to take it upon themself to sing the paeans of Apple Privacy Theatre on a given day, while Apple has been gleefully diluting privacy on their platforms (along with quality and stability, of course). They are the masters at selling dystopian control, lock-in, and software incompetence as something positive.
It's most dangerous that they own the closed hardware and they own the closed software and then they also get away with being "privacy champions". It's worse than irony.
You hit the nail on the head. And it’s something virtually everyone else replying to you is completely missing.
Apple isn’t perfect. They’re not better at privacy than some absolutist position where you run Tails on RISC V, only connect to services over Tor, host your own email, and run your own NAS.
But of all the consumer focused hardware manufacturers and cloud services companies, they are the only ones even trying.
Nowadays, the only way to have a computer belonging to you is using Linux.
I agree 100% with this.
Amongst all the big tech companies Apple is the closest you will get to if you want Privacy.
Privacy is the new obscenity. What does privacy even mean to you, concretely? Answer the question with no additional drama, and I guarantee that either Apple doesn't deliver what you are asking for, or you are using services from another company, like Google, in a way that shows you don't really care about what you are asking for.
End to end encryption by default, such that the cloud provider cannot access my data.
Easy.
It’s honestly not worth engaging with the privacy fundamentalists. They are not arguing in good faith.
Apple doesn’t run open hardware, and supports features users want that involve opening a network connection back home? Hard privacy fail.
> End to end encryption by default, such that the cloud provider cannot access my data.
The App Store stores a lot of sensitive data about you and is not end-to-end encrypted. They operate it just like everyone else. You also use Gmail, which is just as sensitive as your iMessages, and Gmail is not end-to-end encrypted, so it's not clear you value that as much as you say.
I think "could a creepy admin see my nudes" or "can my messages be mined to create a profile of my preferences" are much more practical working definitions of privacy than "can someone see that I've installed an app".
End-to-end encryption is certainly the most relevant feature for these scenarios.
App store DRM is a red herring, as a developer I can still run as much untrusted code on my MBP as I want and I don't see that going away any time soon.
Apple can create a profile of your preferences, for preferences in the sense of things you want to buy. It has your App Store data!
> "can someone see that I've installed an app"
You say preferences and you didn't say what you mean. One meaning of the word preferences: what if you installed Grindr?
You are saying a lot of words but none of them negate the point that Apple has a better security posture for users than any of the other big tech cos. For any meaningful definition of the word "security."
Sure I use gmail, I've been locked in for 15 years. Someday I'll get fed up enough to bite the bullet and move off it.
> Apple has a better security posture for users than any of the other big tech cos. For any meaningful definition of the word "security."
Apple can push updates and change the rules on your device at any time. Rooted Android works better in that regard: you can still use Google stuff on rooted devices. Also I don't think Apple's security posture for users in China is better than every "other big tech co."
The takeaway for me is that Apple's storytelling is really good. They are doing a good job on taking leadership on a limited set of privacy issues that you can convince busy people to feel strongly about. Whether or not that objectively matters is an open question.
There's some weird[1] laws around privacy in Australia, where government departments are blocked from a bunch of things by law. From my perspective as a citizen, this just results in annoyance such as having to fill out forms over and over to give the government data that they already have.
I heard a good definition from my dad: "Privacy for me is pedestrians walking past my window not seeing me step out of the shower naked, or my neighbours not overhearing our domestic arguments."
Basically, if the nude photos you're taking on your mobile phone can be seen by random people, then you don't have privacy.
Apple encrypts my photos so that the IT guy managing the storage servers can't see them. Samsung is the type of company that includes a screen-capture "feature" in their TVs so that they can profile you for ad-targeting. I guarantee you that they've collected and can see the pictures of naked children in the bathtub from when someone used screen mirroring from their phone to show their relatives pictures of their grandkids. That's not privacy.
Sure, I use Google services, but I don't upload naked kid pictures to anything owned by Alphabet corp, so no problem.
However, I will never buy any Samsung product for any purpose because they laugh and point at customer expectations of privacy.
[1] Actually not that weird. Now that I've worked in government departments, I "get" the need for these regulations. Large organisations are made up of individuals, and both the org and the individual people will abuse their access to data for their own benefit. Many such people will even think they're doing the "right thing" while destroying freedom in the process, like people that keep trying to make voting systems traceable... so that vote buying will become easy again.
The most famous leak of nude images that ever "happened" was from iCloud Photos.
That was the result of social engineering though, not iCloud being compromised. AFAIK it was a phishing scam, asking the victims for their usernames and passwords.
> asking the victims for their usernames and passwords.
This should illuminate for you that there is nothing special about iCloud privacy or security, in any sense. It has the same real weaknesses as any other service that is UIs for normal people.
>I am thrilled to shell out thousands and thousands of dollars to purchase a machine that feels like it really belongs to me, from a company that respects my data and has aligned incentives.
Build a desktop PC, yes like a nerdy gamer. ^_^
Install Linux
Been the way for years.
I understand we will be able to disable that, just in case? I don't want a Microsoft Windows telemetry déjà vu.
The single core performance looks really fast.
In my experience, single-core CPU performance is the best all-around indicator of how "fast" a machine feels. I feel like Apple kind of buried this in their press release. M4 benchmark source: https://browser.geekbench.com/v6/cpu/8171874
These numbers are misleading (as in, not an apples-to-apples comparison). The M4 has a matrix-multiply hardware extension which can accelerate code written (or compiled) specifically for this extension.
Also: https://news.ycombinator.com/item?id=40339248
So you're saying Apple added something to the design to make it do things faster...which is literally the definition of improvement.
When a company adds a supercharger to a car does it not count as faster?
When I add more solar panels to my roof does it not count as more power?
Surely doing this kind of thing is exactly what we want companies to be doing to make their products faster/better.
I wonder how reliable Geekbench tests are. AFAIK it's the most common benchmark run on Apple devices, so Apple has a great interest in making sure their newest chips perform well on the test.
I wouldn't be surprised to hear that the Geekbench developers are heavily supported by Apple's own performance engineers, and that the testing might not be as objective or indicative of real-world perf as one would hope.
> I feel like Apple kind of buried this in their press release
The press release describes the single core performance as the fastest ever made, full stop:
"The M4 family features phenomenal single-threaded CPU performance with the world’s fastest CPU core"
The same statement is made repeatedly across most of the new M4 lineup marketing materials. I think that's enough to get the point across that it's a pretty quick machine.
Exactly my point. Saying something is the fastest ever is marketing code (at least to me) for minor improvement over the previous generation.
If you're 30% faster than the previous generation, I'd rather see that because my assumption is it's 5%.
Yeah, better than the glaring "10x better than the i7 Intel Mac" claim. Like that's even a valid point of reference.
Not really fair to the absolutely ancient i7 Intel Macs.
6 years ago isn't that old.
For cpus? Yes?
Not really. Intel CPU performance hasn't changed by orders of magnitude in the last ten years. My ten year old Windoze 10 desktop keeps chugging along fine. My newer 2022 i7 Windows machine works similarly well.
However, attention to keeping Intel Macs performant has taken a dive. My 2019 16" MBP died last week, so I fell back to my standby 2014 MBP and it's much more responsive. No login jank pause. But it also hasn't been eligible for OS updates for 2 or 3 years.
My new M3 MBP is "screaming fast" with Apple's latest patched OS.
My god, it's ridiculous. I really prefer Linux desktops. They've been snappy for the past 30 years, and don't typically get slow UI's after a year or two of updates.
Absolutely incredible to see Apple pushing performance like this.
I can't wait to buy one and finally be able to open more than 20 Chrome tabs.
I don't know much about modern Geekbench scores, but that chart seems to show that M1s are still pretty good? It appears that the M4 is only about 50% faster. Somehow I would expect more like a 100% improvement.
Flameproof suit donned. Please correct me because I'm pretty ignorant about modern hardware. My main interest is playing lots of tracks live in Logic Pro.
Some of it depends on which variant fits you best. But yeah, in general the M1 is still very good--if you hear of someone in your circle selling one for cheap because they're upgrading, nab it.
On the variants: An M1 Max is 10 CPU cores with 8 power and 2 efficiency cores.
M4 Max is 16 cores, 12 + 4. So each power core is 50% faster, but it also has 50% more of them. Add in twice as many efficiency cores, that are also faster for less power, plus more memory bandwidth, and it snowballs together.
One nice pseudo-feature of the M1 is that the thermal design of the current MacBook Pro really hasn't changed since then. It was designed with a few generations of headroom in mind, but that means it's very, very hard to make the fans spin on a 16" M1 Max. You have to utilize all CPU/GPU/NPU cores together to even make them move, while an M3 Max is easier to make (slightly) audible.
> it's very, very hard to make the fans spin on a 16" M1 Max. You have to utilize all CPU/GPU/NPU cores together to even make them move,
I routinely get my M1 fans spinning from compiling big projects. You don’t have to get the GPU involved, but when you do it definitely goes up a notch.
I read so much about the M1 Pros being completely silent that I thought something was wrong with mine at first. Nope, it just turns out that most people don’t use the CPU long enough for the fans to kick in. There’s a decent thermal capacity buffer in the system before they ramp up.
Huh! I regularly max CPU for long stretches (game development), but I found I could only get the fans to move if I engaged the neural cores on top of everything else. Something like a 20+ minute video export that's using all available compute for heavy stabilization or something could do it.
The M3 is much more typical behavior, but I guess it's just dumping more watts into the same thermal mass...
Apple claim up to 1.8x in the press release. They're cherry picking so 50% in a benchmark seems about right.
Appreciate the sanity check.
That's only single core. I think Logic is pretty optimized to use multiple cores (Apple demoed it on the 20 core Xeon Mac Pro back in 2019).
But if the M1 isn't the bottleneck, no reason to upgrade.
Very good to know, thanks.
The M1 was pretty fast when it debuted. If you own an M1 Mac its CPU has not gotten any slower over the years. While newer M-series might be faster, the old one is no slower.
The M1s are likely to remain pretty usable machines for a few years yet, assuming your workload has not or does not significantly change.
No CPU gets slow after any amount of years. That's not how it works. I think what you're trying to say is: Software gets more resource-intensive.
Conversely, the M3 supposedly has better multi core performance? How is that possible?
It doesn't.
It's not really buried... their headline stat is that it's 1.8x faster than the M1, which is actually a bigger improvement than the actual Geekbench score shows (it would be a score of 4354).
Call me cynical, but when I see headlines like "up to 2x faster", I assume it's a cherry-picked result on some workload where they added a dedicated accelerator.
There's a massive difference between "pretty much every app is 80% faster" and "if you render a 4K ProRes video in Final Cut Pro it's 3x faster."
They also explicitly called it out in their announcement videos that the M4 has the fastest CPU cores on the market.
The M-series is such an incredible achievement by Apple.
Sadly, installing Linux is a no-go unless you use some voodoo.
Installing Linux is easy...
that's interesting, the scores are accelerating? 9.8% better, 15.7% better, 23.8% better
> "up to 1.8x faster when compared to the 16-inch MacBook Pro with M1 Pro"
I insist my 2020 Macbook M1 was the best purchase I ever made
Yep.
I've never kept any laptop as long as I've kept the M1. I was more or less upgrading yearly in the past because the speed increases (both in the G4 and then Intel generations) were so significant. This M1 has exceeded my expectations in every category; it's faster, quieter, and cooler than any laptop I've ever owned.
I've had this laptop since release in 2020 and I have nearly 0 complaints with it.
I wouldn't upgrade except that the increase in memory is great (I don't want to have to shut down apps to be able to load some huge LLMs), and I dinged the top case a few months ago, so now there's a shadow on the screen in that spot in some lighting conditions, which is very annoying.
I hope (and expect) the M4 to last just as long as my M1 did.
> I've never kept any laptop as long as I've kept the M1.
My 2015 MBP would like to have a word.
It’s the only laptop purchase I’ve made. I still use it to this day, though not as regularly.
I will likely get a new MBP one of these days.
You'll be glad you did. I loved my 2015 MBP. I even drove 3 hours to the nearest Best Buy to snag one. That display was glorious. A fantastic machine. I eventually gave it to my sister, who continued using it until a few years ago. The battery was gone, but it still worked great.
When you upgrade, prepare to be astonished.
The performance improvement is difficult to convey. It's akin to traveling by horse and buggy. And then hopping into a modern jetliner, flying first class.
It's not just speed. Display quality, build quality, sound quality, keyboard quality, trackpad, ports, etc., have all improved considerably.
The performance jump between a top-of-the-line Intel MBP (I don't remember the year, probably 2019) and the M1 Max I got to replace it... was rather like the perf jump between spinning disks and SSDs.
When I migrated all my laptops to SSDs (lenovos at the time, so it was drop-dead simple), I thought to myself, "this is a once-in-a-generation feeling". I didn't think I would ever be impressed by a laptop's speed ever again. It was nice to be wrong.
> The battery was gone, but it still worked great.
A family 2018 Macbook Air got a second life with a battery replacement. Cheap kit from Amazon, screwdrivers included, extremely easy to do. Still in use, no problems.
My 2015 15" MBP is also still kickin, is/was an absolutely fabulous unit. Was my work machine for 3-4 years, and now another almost-6 years as my personal laptop. My personal use case is obviously not very demanding but it's only now starting to really show its age.
I also have a M1 from work that is absolutely wonderful, but I think it's time for me to upgrade the 2015 with one of these new M4s.
The longevity of Macbooks is insanely good.
If we are going this way… I still use a mid-2012 MBP as my main workstation.
Last one with upgrade capabilities, now it has two fast SSDs and maximum Ram. I changed the battery once.
Only shame is that it doesn’t get major MacOS upgrades anymore.
Still good enough to browse the web, do office productivity and web development.
12 years of good use, I am not sure I can get so much value anywhere now
Same setup here except I use Opencore Legacy Patcher so I’m on the latest OS as well. Works amazingly well.
I never questioned the limitations for these upgrades…
Thanks for reminding me that everything is possible, I may try Opencore to keep it even longer !
How’s the performance? I have 2013 MBP with 16GB RAM, and am curious if the newer OS are more RAM hungry.
I was also a mid-2012 MBP user. I eventually got the M2 MBA because I was investing in my eyesight (modern displays are significantly better). I was never impressed with the touchbar-era macs, they didn't appeal to me and their keyboards were terrible.
I think these M-series MacBook Airs are a worthy successor to the 2012 MBP. I fully intend to use this laptop for at least the same amount of time, ideally more. The lack of a replaceable battery will probably be the eventual killer, which is a shame.
That is amazing. Mine lasted for a super long time as well, and like you, I upgraded everything to its max. I think it was the last model with a 17 inch screen.
Sold mine last year for $100 to some dude who claimed to have some software that only runs on that specific laptop. I didn't question it.
I still have my 2015, and it lived just long enough to keep me going until the death of the touch bar and horrible keyboard, which went away when I immediately bought the M1 Pro on release day.
Exactly same story here.
For its time, the 2015 model was a fantastic package: reliable and robust in form and function.
Would've kept going on it had Apple silicon and 14 inch not come around.
Barring super niche LLM use cases, I don't see why one would need to upgrade.
The 2015 keeps going!
I was considering upgrading to an M3 up until about a month ago when Apple replaced my battery, keyboard, top case, and trackpad completely for free. An upgrade would be nice as it no longer supports the latest MacOS, but at this point, I may just load Ubuntu on the thing and keep using it for another few years. What a machine.
My wife still uses my 2012 MBP 15 retina as her daily driver. The battery's terrible but everything else works fine.
It's extremely easy to replace the battery.
Anything you can buy online ships with all the required screwdrivers, and dozens of YouTube videos or iFixit will give you step-by-step instructions.
10-15 minutes and you'll have the old battery replaced all by yourself.
It's that simple.
I used that same model for 5 years until I finally upgraded in 2017 and totally regretted it, the upgrade was not worth it at all, I would have been just as happy with the 2012. I quickly replaced it again with the "Mea Culpa" 2019 where they added back in ports, etc, would have been just about worth the upgrade over the 2012, 7 years later, but again, not by a big margin.
The 2012 MBP 15" Retina was probably the only machine I bought where the performance actually got better over the years, as the OS got more optimized for it (the early OS revisions had very slow graphics drivers dealing with the retina display)
The M1 Pro on the other hand, that was a true upgrade. Just a completely different experience to any Apple Intel laptop.
2017 and 2019 had the same ports?
Ah you're right, it was only the keyboard and battery they fixed.
It's been too long, I guess I had blocked out just how terrible those Intel MacBooks were.
I loved my 2015 MBP, probably the best machine Apple made, overall, until arguably the 2019 16" (read: after the butterfly keyboard debacle)
Traded it for an M1 Air in 2021 and was astonished at how much faster it was. It even blew away my 2019 16" from work.
You're going to be even more blown away!
My 2015 MBP would probably have been totally fine for development... except for the Docker-based workflows that everybody uses now.
Rebuilding a bunch of Docker images on an older intel mac is quite the slow experience if you're doing it multiple times per day.
I've just replaced a 2012 i5 mbp, and used it for Dev work and presentations into 2018.
It has gotten significantly slower in the last 2 years, but the more obvious issues are the sound, the inability to do virtual backgrounds, and now the lack of software updates.
But if you had told me I'd need to replace it in 2022 I wouldn't believe you
Ah my 2013 mbp died in 2019. It was the gpu. No way to repair it for cheap enough so I had to replace it with a 2019 mbp which was the computer I kept the shortest (I hated the keyboard).
My 2011 MBP died in 2023, it was used daily but very slow at the end of its life.
The 2015 MacBook Pro is the Nokia 3310 of our generation.
I loved my 2015 MBP too.
I recently replaced it with a used MBA M1, 16GB, 2TB.
It's insane how much faster it is, how long the battery lasts and how cool and silent it is. Completely different worlds.
How do you justify this kind of recurring purchase, even with selling your old device? I don't get the behaviour or the driving decision factor past the obvious "I need the latest shiny toy" (I can't find the exact words to describe it, so apologies for the reductive description).
I have either assembled my own desktop computers or purchased ex-corporate Lenovos over the years, with a mix of Windows (for gaming, obviously) and Linux, and only recently (4 years ago) was given an MBP by work, as they (IT) cannot manage Linux machines like they do macOS and Windows.
I have moved from an Intel i5 MBP to an M3 Pro (?) and it makes me want to throw away the dependable ThinkPad/Fedora machine I still use for personal projects.
It's really very easy, honestly.
My laptop is my work life and my personal life.
I spend easily 100 hours a week using it not-as-balanced-as-it-should-be between the two.
I don't buy them because I need something new, I buy them because in the G4/Intel era, the iterations were massive and even a 20 or 30% increase in speed (which could be memory, CPU, disk -- they all make things faster) results in me being more productive. It's worth it for me to upgrade immediately when apple releases something new, as long as I have issues with my current device and the upgrade is enough of a delta.
M1 -> M2 wasn't much of a delta and my M1 was fine. M1 -> M3 was a decent delta, but, my M1 was still fine. M1 -> M4 is a huge delta (almost double) and my screen is dented to where it's annoying to sit outside and use the laptop (bright sun makes the defect worse), so, I'm upgrading. If I hadn't dented the screen the choice would be /a lot/ harder.
I love ThinkPads too. Really can take a beating and keep on going. The post-IBM era ones are even better in some regards too. I keep one around running Debian for Linux-emergencies.
There are two things I always spend money on if what I have isn't close to the best achievable: my bed and my laptop. Even the phone can be a 4-year-old iPhone, but the laptop must be the best and fast. My sleep is also pretty important. Everything else is just "eco".
In my country you can buy a device and write it off over 2 years, VAT reimbursed, then scrap it from the books and sell it, without tax paid, to people who would otherwise pay a pretty hefty VAT. This cuts your loss of value roughly in half.
I don't think tax evasion is something one should recommend people do.
Apple has a pretty good trade-in program. If you have an Apple card, it's even better (e.g. the trade-in value is deducted immediately, zero interest, etc.).
Could you get more money by selling it? Sure. But it's hard to beat the convenience. They ship you a box. You seal up the old device and drop it off at UPS.
I also build my desktop computers with a mix of Windows and Linux. But those are upgraded over the years, not regularly.
You’re better off taking it to the Apple Store for trade-in. There are a lot of easy-to-miss reasons the mail-in one might reject it.
Consuming... for some people, is done for its own sake.
>I've never kept any laptop as long as I've kept the M1
What different lives we live. This first M1 was in November 2020. Not even four years old. I’ve never had a [personal] computer for _less_ time than that. (Work, yes, due to changing jobs or company-dictated changes/upgrades)
My work computer is my personal computer. I easily spend 100+ hours a week using it.
Same here, but I'm still using a mid 2012 Macbook Pro. It's got an SSD and 32GB of ram, but it still works great.
I feel like just running teams on that would make it cry.
I also bet it sounds like a Harrier jet when doing most things, at the temperature of a hot plate.
"I've had this laptop since release in 2020 and I have nearly 0 complaints with it."
Me too. Only one complaint. After I accidentally spilled a cup of water into it on an airplane, it didn't work.
(However AppleCare fixed it for $300 and I had a very recent backup. :) )
If you don’t have AppleCare, it costs $1400+. M2 Pro here that I’m waiting to fix or upgrade because of that.
What's more annoying is that I'd just get a new one and recycle this one, but the SSD is soldered on. Good on you for having a backup.
Do not own a Mac unless you bought it used or have AppleCare.
I've been using portable macs for the last 25 years.
Never had AppleCare or any other extended warranty program.
Did just fine up to now.
Interesting. I have found occasion to use it for pretty much every Mac I've owned since the 1980s! I'm not sure how much money it's saved compared to just paying for repairs when needed, but I suspect it may come out to:
1) a slight overall savings, though I'm not sure about that. 2) a lack of stress when something breaks. Even if there isn't an overall savings, for me it's been worth it because of that.
Certainly, my recent Mac repair would have cost $1500 and I only paid $300, and I think I've had the machine for about 3 years, so there's a savings there, not to mention considerably less stress. That's similar to the experience I've had all along, although this recent expense would probably have been my most expensive repair ever.
"SSD is soldered on" is a bit of glossing over of the issue with the M-series Macs.
Apple is putting raw NAND chips on the board (and yes, soldering them) and the controller for the SSD is part of the M-series chip. Yes, Apple could use NVMe here if you ignore the physical constraints, ignore the fact that it wouldn't be quite as fast, and ignore the fact that it would increase their BOM cost.
I'm not saying Apple is definitively correct here, but, it's good to have choice and Apple is the only company with this kind of deeply integrated design. If you want a fully modular laptop, go buy a framework (they are great too!) and if you want something semi-modular, go buy a ThinkPad (also great!).
The problem is those other options won't run macOS. If the OS is a given then there's no choice.
Day to day I don't mind but when needs change or something breaks it's unfortunate to have to replace the whole machine to fix it.
Yeah, I always have AppleCare. I view it as part of the cost of a mac (or iPhone).
And yeah, this incident reminded me of why it's important to back up as close to daily as you can, or even more often during periods when you're doing important work and want to be sure you have the intermediate steps.
My 2019 i9 going strong as ever. With 64gb ram, really don’t need to upgrade for at least a couple more years.
I had the 2019 i9. The power difference and the cooling difference is astounding from the 2019 to the M1 (and the M1 is faster).
I actually use my laptop on my lap commonly and I think the i9 was going to sterilize me.
I had a 2019 i9 for a work laptop. It was absolutely awful, especially with the corporate anti-virus / spyware on it that brought it to a crawl. Fans would run constantly. Any sort of Node JS build would make it sound like a jet engine.
yeah I had the 8-core i9 and I was shocked at how much faster my M1 Air was when I got it. No fan and still acing the old MBP!
Now on M2 MBP and will probably be using it for a very long time.
I have the OG 13" MBP M1, and it's been great; I only have two real reasons I'm considering jumping to the 14" MBP M4 Pro finally:
- More RAM, primarily for local LLM usage through Ollama (a bit more overhead for bigger models would be nice)
- A bit niche, but I often run multiple external displays. DisplayLink works fine for this, but I also use live captions heavily and Apple's live captions don't work when any form of screen sharing/recording is enabled... which is how Displaylink works. :(
Not quite sold yet, but definitely thinking about it.
The M1 Max supports more than one external display natively, which is also an option.
Yep. That's roughly 20% per generation improvement which ain't half-bad these days, but the really huge cliff was going from Intel to the M1 generation.
M1 series machines are going to be fine for years to come.
It feels like M1 was the revolution, subsequent ones evolution - smaller fabrication process for improved energy efficiency, more cores for more power, higher memory (storage?) bandwidth, more displays (that was a major and valid criticism for the M1 even though in practice >1 external screens is a relatively rare use case for <5% of users).
Actually, wasn't the M1 itself an evolution / upscale of their A-series CPUs, which they've now been working on since before 2010? The iPhone 4 was the first one with their own CPU, although that design was from Samsung + Intrinsity; it was only the A6 that they claimed was custom-designed by Apple.
It's a very robust and capable small laptop. I'm typing this to a M1 Macbook Air.
The only thing to keep in mind, is that the M1 was the first CPU in the transition from Intel CPUs (+ AMD GPUs) to Apple Silicon. The M1 was still missing a bunch of things from earlier CPUs, which Apple over time added via the M1 Pro and other CPUs. Especially the graphics part was sufficient for a small laptop, but not for much beyond. Better GPUs and media engines were developed later. Today, the M3 in a Macbook Air or the M4 in the Macbook Pro have all of that.
For me the biggest surprise was how well the M1 Macbook Air actually worked. Apple did an outstanding job in the software & hardware transition.
Yep. Bought an M1 Max in 2021 and it still feels fast, battery lasts forever. I’m sure the M4 would be even quicker (Lightroom, for example) but there’s little reason to consider an upgrade any time soon.
I still use my MacBook Air M1 and given my current workloads (a bit of web development, general home office use and occasional video editing and encoding) I doubt I’ll need to replace it in the coming 5 years. That’ll be an almost 10 year lifespan.
And my 2020 Intel Macbook Air was a bad purchase. Cruelly, the Intel and M1 Macbook Air released within 6 months of each other.
In early 2020, I had an aging 2011 Air that was still struggling after a battery replacement. Even though I "knew" the Apple Silicon chips would be better, I figured a 2020 Intel Air would last me a long time anyway, since my computing needs from that device are light, and who knows how many years the Apple Silicon transition would take anyway?
Bought a reasonably well-specced Intel Air for $1700ish. The M1s came out a few months later. I briefly thought about the implication of taking a hit on my "investment", figured I might as well cry once rather than suffer endlessly. Sold my $1700 Intel Air for $1200ish on craigslist (if I recall correctly), picked up an M1 Air for about that same $1200 pricepoint, and I'm typing this on that machine now.
That money was lost as soon as I made the wrong decision, I'm glad I just recognized the loss up front rather than stewing about it.
Exact same boat here. A friend and I both bought the 2020 Intel MBA thinking that the M1 version was at least a year out. It dropped a few months later. I immediately resold my Intel MBA seeing the writing on the wall and bought a launch M1 (which I still use to this day). Ended up losing $200 on that mis-step, but no way the Intel version would still get me through the day.
That said...scummy move by Apple. They tend to be a little more thoughtful in their refresh schedule, so I was caught off guard.
When I saw the M1s come out, I thought that dev tooling would take a while to work for M1, which was correct. It probably took a year for most everything to be compiled for arm64. However I had too little faith in Rosetta and just the speed upgrade M1 really brought. So what I mean to say is, I still have that deadweight MBA that I only use for web browsing :)
Oh yes, my wife bought a new Intel MBA in summer 2020... I told her at the time that Apple was planning its own chip, but that it couldn't be much better than the Intel one and that surely Apple would increase prices too... I was so wrong.
Yeah I’m in the same boat. I had my old mid 2013 Air for 7 years before I pulled the trigger on that too. I’ll be grabbing myself an M4 Pro this time
Amen. I got a crazy deal on a brand new 2020 M1 Max MBP with 64GB/2TB in 2023.
This is the best machine I have ever owned. It is so completely perfect in every way. I can't imagine replacing it for many many years.
Congratulations! Just curious, what was the deal?
At the end of 2023, B&H Photo Video was selling the M1 Max 16" 64GB/2TB for $2,499. It's the lowest I've ever seen it anywhere, and I got one myself.
Thanks! Looks like obscure retailers sometimes have good deals.
I switched from a 2014 MacBook Pro to a 2020 M1 MacBook Air. Yeah, the CPU is much faster, but the build quality and software are a huge step backwards. The trackpad feels fake, not nearly as responsive, and the keyboard also doesn't feel as solid. But now I'm already used to it.
Agree, even without Whisky (this Whisky: https://getwhisky.app).
With Whisky I feel like I'd never need anything else. That said, the benchmark jump in the M4 has me thinking I should save up and grab a refurb in a year or two.
I don't like my M1. It's really good for using Lightroom at the coffee shop, but absolutely sucks for developing software
The M1 Pro compared to Intel was such a big step ahead that I suppose we're all still surprised and excited. Quiet, long battery life, and better performance. By a lot! I wonder if the M4 really feels that much faster and better. Having an M1 Pro, I'm not going to change quickly, but maybe a Mac Mini will land some day.
Honestly it was a game changer. Before I'd never leave the house without a charger, nowadays I rarely bring it with me on office days, even with JS / front-end workloads.
(of course, everyone else has a macbook too, there's always someone that can lend me a charger. Bonus points that the newer macbooks support both magsafe and USB-C charging. Added bonus points that they brought back magsafe and HDMI ports)
There are so many ports that would be far more useful than MagSafe at a similar hardware cost
Yes, I got mine for 900 Euros (16, 256). Still working perfectly. What a bargain that was.
Reading this on my late 2013 MBP. It is so old that I can't install the latest Darktable on it.
Same. My MBP and M1 Air are amazing machines. But I’m now also excited that any future M chip replacement will be faster and just as nice.
The battery performance is incredible too.
I have the same one, but everyone I know with an M series Mac says the same thing. These are the first machines in a long time built to not only last a decade but be used for it.
It's annoyingly good! I want to upgrade, but especially having splurged on 64GB of RAM, I have very little justifiable reason.
Except for the USB-C charge port: MagSafe was the best invention and I'll never understand why it was removed.
I have the M1 MacBook Pro with MagSafe and I still charge it via USB C simply because I can't be bothered to carry around another cable when all of my other peripherals are USB C.
I have the same. I would much rather have a different port, like USB-A, than something I have never used
It's the other way around, isn't it? MagSafe was removed in the 2016-2019 model years (not sure why; maybe to shave off another bit of thickness?), and then brought back in 2021 on the MacBook Pro and 2022 on the MacBook Air.
Personally, I practically never use MagSafe, because the convenience of USB C charging cables all over the house outweighs the advantages of MagSafe for me.
Pro tip, USB c magnetic adapter is cheap and works well enough
I got a refurbed M1 iPad Pro 12.9” for $900 a couple years ago and have been quite pleased. I still have a couple of years life in it I estimate.
The M4 Max goes up to 128GB RAM, and "over half a terabyte per second of unified memory bandwidth" - LLM users rejoice.
The M3 Max was 400GBps, this is 546GBps. Truly an outstanding case for unified memory. DDR5 doesn't come anywhere near.
Apple is using LPDDR5 for M3. The bandwidth doesn't come from unified memory - it comes from using many channels. You could get the same bandwidth or more with normal DDR5 modules if you could use 8 or more channels, but in the PC space you don't usually see more than 2 or 4 channels (only common for servers).
Unrelated but unified memory is a strange buzzword being used by Apple. Their memory is no different than other computers. In fact, every computer without a discrete GPU uses a unified memory model these days!
> (only common for servers).
On PC desktops I always recommend getting a mid-range tower server precisely for that reason. My oldest one is about 8 years old and only now it's showing signs of age (as in not being faster than the average laptop).
> In fact, every computer without a discrete GPU uses a unified memory model these days!
On PCs some other hardware (notably the SSD) comes with its own memory. But here it's shared with the main DRAM too.
This is not necessarily a performance improvement; it can avoid copies, but it also means less is available to the CPU.
DRAM-less NVMe (utilizing HMB) is also common on PCs, but it's seen as a slower budget alternative rather than a good thing.
I read all that marketing stuff and my brain just sees APU. I guess at some level, that’s just marketing stuff too, but it’s not a new idea.
The new idea is having 512-bit-wide memory instead of the PC limitation of 128 bits wide. Normal CPU cores running normal code are not particularly bandwidth-limited. However, APUs/iGPUs are severely bandwidth-limited, hence the huge number of slow iGPUs that are fine for browsing but terrible for anything more intensive.
So Apple manages decent GPU performance, a tiny package, and great battery life. It's much harder on the PC side because every laptop/desktop chip from Intel and AMD uses a 128-bit memory bus. You have to take a huge step up in price, power, and size with something like a Threadripper, Xeon, or Epyc to get more than 128-bit-wide memory, none of which are available in a laptop or Mac-mini-sized SFF.
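A quick back-of-the-envelope of why the bus width matters so much (a minimal Python sketch; the LPDDR5-6400 data rate is an illustrative assumption, not any vendor's spec):

    # Peak bandwidth = transfers per second * bytes per transfer.
    def peak_gb_per_s(data_rate_mts, bus_width_bits):
        return data_rate_mts * (bus_width_bits / 8) / 1000

    # Typical PC laptop: CPU and iGPU share a 128-bit bus (assumed LPDDR5-6400).
    print(peak_gb_per_s(6400, 128))  # ~102 GB/s total
    # Same data rate on a 512-bit bus, roughly the Max-class configuration:
    print(peak_gb_per_s(6400, 512))  # ~410 GB/s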
> The new idea is having 512 bit wide memory instead of PC limitation of 128 bit wide.
It's not really a new idea, just unusual in computers. The custom SOCs that AMD makes for Playstation and Xbox have wide (up to 384-bit) unified memory buses, very similar to what Apple is doing, with the main distinction being Apples use of low-power LPDDR instead of the faster but power hungrier GDDR used in the consoles.
Yeah, a lot of it is just market forces. I guess going to four channels is costly for the desktop PC space and that's why that didn't happen, and laptops just kind of followed suit. But now that Apple is putting pressure on the market, perhaps we'll finally see quad-channel becoming the norm in desktop PCs? Would be nice...
> instead of PC limitation of 128 bit wide
Memory interface width of modern CPUs is 64-bit (DDR4) and 32+32 (DDR5).
No CPU uses 128b memory bus as it results in overfetch of data, i.e., 128B per access, or two cache lines.
AFAIK Apple uses 128B cache lines, so they can do much better design and customization of memory subsystem as they do not have to use DIMMs -- they simply solder DRAM to the motherboard, hence memory interface is whatever they want.
> Memory interface width of modern CPUs is 64-bit (DDR4) and 32+32 (DDR5).
Sure, per channel. PCs have 2x64 bit or 4x32 bit memory channels.
Not sure I get your point, yes PCs have 64 bit cache lines and apple uses 128. I wouldn't expect any noticeable difference because of this. Generally cache miss is sent to a single memory channel and result in a wait of 50-100ns, then you get 4 or 8 bytes per cycle at whatever memory clock speed you have. So apple gets twice the bytes per cache line miss, but the value of those extra bytes is low in most cases.
Other bigger differences is that apple has a larger page size (16KB vs 4KB) and arm supports a looser memory model, which makes it easier to reach a large fraction of peak memory bandwidth.
However, I don't see any relationship between Apple and PCs as far as DIMMS. Both Apple and PCs can (and do) solder dram chips directly to the motherboard, normally on thin/light laptops. The big difference between Apple and PC is that apple supports 128, 256, and 512 bit wide memory on laptops and 1024 bit on the studio (a bit bigger than most SFFs). To get more than 128 bits with a PC that means no laptops, no SFFs, generally large workstations with Xeon, Threadrippers, or Epyc with substantial airflow and power requirements
FYI cache lines are 64 bytes, not bits. So Apple is using 128 bytes.
Also important to consider that the RTX 4090 has a relatively tiny 384-bit memory bus. Smaller than the M1 Max's 512-bit bus. But the RTX 4090 has 1 TB/s bandwidth and significantly more compute power available to make use of that bandwidth.
Ugh, should have caught the bit vs byte, thanks.
The M4 max is definitely not a 4090 killer, does not match it in any way. It can however work on larger models than the 4090 and have a battery that can last all day.
My memory is a bit fuzzy, but I believe the M3 Max did decently in some games compared to the laptop Nvidia 4070 (which is not the same as the desktop 4070). But it highly depended on whether the game was x86-64 (requiring emulation) and whether it was DX11 or Apple-native. I believe Apple claims improvements in Metal (Apple's GPU library) and that the M4 GPUs have better FP for ray tracing, but no significant changes in rasterized performance.
I look forward to the 3rd party benchmarks for LLM and gaming on the m4 max.
What I was trying to say is that there is no 128b limitation for PCs.
Eh… not quite. Maybe on an Instinct. Unified memory means the CPU and GPU can do zero-copy and use the same memory buffer.
Many integrated graphics segregate the memory into CPU owned and GPU owned, so that even if data is on the same DIMM, a copy still needs to be performed for one side to use what the other side already has.
This means that the drivers, etc, all have to understand the unified memory model, etc. it’s not just hardware sharing DIMMs.
I was under the impression PS4’s APU implemented unified memory, and it was even referred to by that name[1].
APUs with shared everything are not a new concept, they are actually older than programmable graphics coprocessors…
https://www.heise.de/news/Gamescom-Playstation-4-bietet-Unif...
I believe that at least on Linux you get zero-copy these days. https://www.phoronix.com/news/AMD-AOMP-19.0-2-Compiler
High end servers now have 12 ddr5 channels.
Yes, you could buy a brand new (announced weeks ago) AMD Turin. 12 channels of DDR5-6000, $11,048 and 320 watts (for the CPU) and get 576GB/sec peak.
Or you could buy an M3 Max laptop for $4k, get 10+ hour battery life, have it fit in a thin/light laptop, and still get 546GB/sec. However, those are peak numbers. Apple uses longer cache lines (double), larger page sizes (quadruple), and a looser memory model. Generally I'd expect nearly every memory bandwidth measure to win on Apple over AMD's Turin.
AnandTech did bandwidth benchmarks for the M1 Max and was only able to utilize about half of it from the CPU, and the GPU used even less in 3D workloads because it wasn't bandwidth limited. It's not all about bandwidth. https://www.anandtech.com/show/17024/apple-m1-max-performanc...
Indeed. RIP Anandtech. I've seen bandwidth tests since then that showed similar for newer generations, but not the m4. Not sure if the common LLM tools on mac can use CPU (vector instructions), AMX, and Neural engine in parallel to make use of the full bandwidth.
You lose out on things like expandability (more storage, more PCIe lanes) and repairability though. You are also (on M4 for probably a few years) compelled to use macOS, for better or worse.
There are, in my experience, professionals who want to use the best tools someone else builds for them, and professionals who want to keep iterating on their tools to make them the best they can be. It's the difference between, say, a violin and a Eurorack. Neither's better or worse, they're just different kinds of tools.
Agreed.
I was sorely tempted by the Mac studio, but ended up with a 96GB ram Ryzen 7900 (12 core) + Radeon 7800 XT (16GB vram). It was a fraction of the price and easy to add storage. The Mac M2 studio was tempting, but wasn't refreshed for the M3 generation. It really bothered me that the storage was A) expensive, B) proprietary, C) tightly controlled, and D) you can't boot without internal storage.
Even moving storage between Apple studios can be iffy. Would I be able to replace the storage if it died in 5 years? Or expand it?
As tempting as the size, efficiency, and bandwidth were I just couldn't justify top $ without knowing how long it would be useful. Sad they just didn't add two NVMe ports or make some kind of raw storage (NVMe flash, but without the smarts).
> Even moving storage between Apple studios can be iffy.
This was really driven home to me by my recent purchase of an Optane 905p, a drive that is both very fast and has an MTBF measured in the hundreds of years. Short of a power surge or (in California) an earthquake, it's not going to die in my lifetime -- why should I not keep using it for a long time?
Many kinds of professionals are completely fine with having their Optanes and what not only be plugged in externally, though, even though it may mean their boot drive will likely die at some point. That's completely okay I think.
I doubt you'll get 10+ hours on battery if you utilize it at max. I don't even know if it can really sustain the maximum load for more than a couple of minutes because of thermal or some other limits.
FWIW I ran a quick test of gemma.cpp on M3 Pro with 8 threads. Similar PaliGemma inference speed to an older AMD (Rome or Milan) with 8 threads. But the AMD has more cores than that, and more headroom :)
CXL memory is also a thing.
Yes, it's just easier to call it that without having to sprinkle asterisks at each mention of it :)
And yes, the impressive part is that this kind of bandwidth is hard to get on laptops. I suppose I should have been a bit more specific in my remark.
Yeah memory bandwidth is one of the really unfortunate things about the consumer stuff. Even the 9950x/7950x, which are comfortably workstation-level in terms of compute, are bound by their 2 channel limits. The other day I was pricing out a basic Threadripper setup with a 7960x (not just for this reason but also for more PCIe lanes), and it would cost around $3000 -- somewhat out of my budget.
This is one of the reasons the "3D vcache" stuff with the giant L3 cache is so effective.
Isn't unified memory* a crucial part in avoiding signal integrity problems?
Servers do have many channels but they run relatively slower memory
* Specifically, it being on-die
"Unified memory" doesn't really imply anything about the memory being located on-package, just that it's a shared pool that the CPU, GPU, etc. all have fast access to.
Also, DRAM is never on-die. On-package, yes, for Apple's SoCs and various other products throughout the industry, but DRAM manufacturing happens in entirely different fabs than those used for logic chips.
System memory DRAM never is, but sometimes DRAM is technically included on CPU dies as a cache
https://en.wikipedia.org/wiki/EDRAM
It's mostly an IBM thing. In the consumer space, it's been in game consoles with IBM-fabbed chips. Intel's use of eDRAM was on a separate die (there was a lot that was odd about those parts).
For comparison, a Threadripper Pro 5000 workstation with 8x DDR4 3200 has 204.8GB/s of memory bandwidth. The Threadripper Pro 7000 with DDR5-5200 can achieve 325GB/s.
And no, manaskarekar, the M4 Max does 546 GB/s not GBps (which would be 8x less!).
> And no, manaskarekar, the M4 Max does 546 GB/s not GBps (which would be 8x less!).
GB/s and GBps mean the same thing, though GB/s is the more common way to express it. Gb/s and Gbps are the units that are 8x less: bits vs Bytes.
Thanks for the numbers. Someone here on hackernews got me convinced that a Threadripper would be a better investment for inference than a MacBook Pro with a M3 Max.
B = Bytes, b = bits.
GB/s is the same thing as GBps
The "ps" means "per second"
I was curious so I looked it up:
https://en.wikipedia.org/wiki/DDR5_SDRAM (info from the first section):
> DDR5 is capable of 8GT/s which translates to 64 GB/s (8 gigatransfers/second * 64-bit width / 8 bits/byte = 64 GB/s) of bandwidth per DIMM.
So for example if you have a server with 16 DDR5 DIMMs (sticks) it equates to 1,024 GB/s of total bandwidth.
DDR4 clocks in at 3.2GT/s and the fastest DDR3 at 2.1GT/s.
DDR5 is an impressive jump. HBM is totally bonkers at 128GB/s per DIMM (HBM is the memory used in the top end Nvidia datacenter cards).
Cheers.
> So for example if you have a server with 16 DDR5 DIMMs (sticks) it equates to 1,024 GB/s of total bandwidth.
Not quite, as it depends on the number of channels and not on the number of DIMMs. An extreme example: put all 16 DIMMs on a single channel and you will get the performance of a single channel.
Thanks for your reply. Are you up for updating the Wikipedia page?, because as of now the canonical reference is wrong.
If you're referring to the line you quoted, then no, it's not wrong. Each DIMM is perfectly capable of 64GB/s, just as the article says. Where it might be confusing is that the article seems to only be concerning itself with the DIMM itself and not with the memory controller on the other end. As the other reply said, the actual bandwidth available also depends on the number of memory channels provided by the CPU, where each channel provides one DIMM's worth of bandwidth.
This means that in practice, consumer x86 CPUs have only 128GB/s of DDR5 memory bandwidth available (regardless of the number of DIMM slots in the system), because the vast majority of them only offer two memory channels. Server CPUs can offer 4, 8, 12, or even more channels, but you can't just install 16 DIMMs and expect to get 1024GB/s of bandwidth unless you've verified that your CPU has 16 memory channels.
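To make the channels-vs-DIMMs point concrete, a rough sketch (assuming the DDR5-8000 / 64 GB/s-per-channel figure quoted above; real systems lose some of this to overhead):

    # Bandwidth is capped by the CPU's memory channels, not by how many DIMMs you install.
    PER_CHANNEL_GB_S = 8000 * 64 / 8 / 1000  # DDR5-8000 on a 64-bit channel -> 64 GB/s

    def usable_bandwidth_gb_s(cpu_channels, dimms_installed):
        # DIMMs beyond the channel count add capacity, not bandwidth.
        return min(cpu_channels, dimms_installed) * PER_CHANNEL_GB_S

    print(usable_bandwidth_gb_s(2, 4))    # consumer desktop: ~128 GB/s, no matter how many sticks
    print(usable_bandwidth_gb_s(12, 16))  # 12-channel server with 16 DIMMs: ~768 GB/s, not 1024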
Got it, thanks for clarifying.
Happy Halloween!
Yes, and wouldn’t it be bonkers if the M4 Max supported HBM on desktops?
It's not the memory being unified that makes it fast, it's the combination of the memory bus being extremely wide and the memory being extremely close to the processor. It's the same principle that discrete GPUs or server CPUs with onboard HBM memory use to make their non-unified memory go ultra fast.
I thought “unified memory” was just a marketing term for the memory being extremely close to the processor?
No, unified memory usually means the CPU and GPU (and miscellaneous things like the NPU) all use the same physical pool of RAM and moving data between them is essentially zero-cost. That's in contrast to the usual PC setup where the CPU has its own pool of RAM, which is unified with the iGPU if it has one, but the discrete GPU has its own independent pool of VRAM and moving data between the two pools is a relatively slow operation.
An RTX4090 or H100 has memory extremely close to the processor but I don't think you would call it unified memory.
I don't quite understand one of the finer points of this, under caffeinated :) - if GPU memory is extremely close to the CPU memory, what sort of memory would not be extremely close to the CPU?
I think you misunderstood what I meant by "processor", the memory on a discrete GPU is very close to the GPUs processor die, but very far away from the CPU. The GPU may be able to read and write its own memory at 1TB/sec but the CPU trying to read or write that same memory will be limited by the PCIe bus, which is glacially slow by comparison, usually somewhere around 16-32GB/sec.
A huge part of optimizing code for discrete GPUs is making sure that data is streamed into GPU memory before the GPU actually needs it, because pushing or pulling data over PCIe on-demand decimates performance.
I see, TL;DR == none; and processor switches from {CPU,GPU} to {GPU} in the 2nd paragraph. Thanks!
I thought it meant that both the GPU and the CPU can access it. In most systems, GPU memory cannot be accessed by the CPU (without going through the GPU); and vice versa.
CPUs access GPU memory via MMIO (though usually only a small portion), and GPUs can in principle access main memory via DMA. Meaning, both can share an address space and access each other’s memory. However, that wouldn’t be called Unified Memory, because it’s still mediated by an external bus (PCIe) and thus relatively slower.
Are they cache coherent these days? I feel like any unified memories should be.
It's not "DDR5" on its own, it's a few factors.
Bandwidth (GB/s) = (Data Rate (MT/s) * Channel Width (bits) * Number of Channels) / 8 / 1000
(8800 MT/s * 64 bits * 8 channels) / 8 / 1000 = 563.2 GB/s
This is still half the speed of a consumer NVidia card, but the large amounts of memory is great, if you don't mind running things more slowly and with fewer libraries.
> (8800 MT/s * 64 bits * 8 channels) / 8 / 1000 = 563.2 GB/s
Was this example intended to describe any particular device? Because I'm not aware of anything that operates at 8800 MT/s, especially not with 64-bit channels.
M4 max in the MBP (today) and in the Studio at some later date.
That seems unlikely given the mismatched memory speed (see the parent comment) and the fact that Apple uses LPDDR which is typically 16 bits per channel. 8800MT/s seems to be a number pulled out of thin air or bad arithmetic.
Heh, ok, maybe slightly different. But apple spec claims 546GB/sec which works out to 512 bits (64 bytes) * 8533. I didn't think the point was 8533 vs 8800.
I believe I saw somewhere that the actual chips used are LPDDR5X-8533.
Effectively the parent's formula describes the M4 Max, give or take 5%.
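For anyone who wants to check the arithmetic, a minimal sketch assuming the LPDDR5X-8533 / 512-bit configuration inferred in this thread (Apple only publishes the headline bandwidth number):

    # 512-bit bus at 8533 MT/s -> bytes per second.
    bus_width_bits = 512
    data_rate_mts = 8533
    peak_gb_per_s = data_rate_mts * (bus_width_bits / 8) / 1000
    print(peak_gb_per_s)  # ~546 GB/s, matching Apple's quoted M4 Max figure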
Fewer libraries? Any that a normal LLM user would care about? Pytorch, ollama, and others seem to have the normal use cases covered. Whenever I hear about a new LLM seems like the next post is some mac user reporting the token/sec. Often about 5 tokens/sec for 70B models which seems reasonable for a single user.
Is there a normal LLM user yet? Most people would want their options to be as wide as possible. The big ones usually get covered (eventually), and there are distinct good libraries emerging for Mac only (sigh), but last I checked the experience of running every kit (stable diffusion, server-class, etc) involved overhead for the Mac world.
> This is still half the speed of a consumer NVidia card, but the large amounts of memory is great, if you don't mind running things more slowly and with fewer libraries.
But it has more than 2x longer battery life and a better keyboard than a GPU card ;)
Right, the nvidia card maxes out at 24GB.
A 24gb model is fast and ranks 3. A 70b model is slow and 8.
A top tier hosted model is fast and 100.
Past what specialized models can do, it's about a mixture/agentic approach and next level, nuclear power scale. Having a computer with lots of relatively fast RAM is not magic.
Thanks, but just to put things into perspective, this calculation has counted 8 channels which is 4 DIMMs and that's mostly desktops (not dismissing desktops, just highlighting that it's a different beast).
Most laptops will be 2 DIMMS (probably soldered).
Desktops are two channels of 64 bits, or with DDR5 now four (sub)channels of 32 bits; either way, mainstream desktop platforms have had a total bus width of 128 bits for decades. 8x64 bit channels is only available from server platforms. (Some high-end GPUs have used 512-bit bus widths, and Apple's Max level of processors, but those are with memory types where the individual channels are typically 16 bits.)
I think you are confusing channels and dimms.
The vast majority of x86 laptops and desktops are 128 bits wide: often 2x64-bit channels up until a year or so ago, now 4x32-bit with DDR5. There are some benefits to 4 channels over 2, but generally you are still limited to 128 bits unless you buy a Xeon, Epyc, or Threadripper (or the Intel equivalent), which are expensive, hot, and don't fit in SFFs or laptops.
So basically the PC world is crazy behind the 256, 512, and 1024 bit wide memory busses apple has offered since the M1 arrived.
Is it GBps or Gbps?
GB per second
This is a case for on-package memory, not for unified memory... Laptops have had unified memory forever
EDIT: wtf what's so bad about this comment that it deserves being downvoted so much
Intel typically calls their iGPU architecture "shared memory"
Hm it seems like they call it unified memory too, at least in some places, have a look at 5.7.1 "Unified Memory Architecture" in this document: https://www.intel.com/content/dam/develop/external/us/en/doc...
It more or less seems like they use "unified memory" and "shared memory" interchangeably in that section.
I think "Unified" vs "shared" is just something Apple's marketing department came up with.
Calling something "shared" makes you think: "there's not enough of it, so it has to be shared".
Calling something "unified" makes you think: "they are good engineers, they managed to unify two previously separate things, for my benefit".
I don't think so? That PDF I linked is from 2015, way before Apple put focus on it through their M-series chips... And the Wikipedia article on "Glossary of computer graphics" has had an entry for unified memory since 2016: https://en.wikipedia.org/w/index.php?title=Glossary_of_compu...
For Apple to have come up with using the term "unified memory" to describe this kind of architecture, they would've needed to come up with it at least before 2016, meaning A9 chip or earlier. I have paid some attention to Apple's SoC launches through the years and can't recall them touting it as a feature in marketing materials before the M1. Do you have something which shows them using the term before 2016?
To be clear, it wouldn't surprise me if it has been used by others before Intel did in 2015 as well, but it's a starting point: if Apple hasn't used the term before then, we know for sure that they didn't come up with it, while if Apple did use it to describe A9 or earlier, we'll have to go digging for older documents to determine whether Apple came up with it
There are actual differences but they're mostly up to the drivers. "Shared" memory typically means it's the same DRAM but part of it is carved out and can only be used by the GPU. "Unified" means the GPU/CPU can freely allocate individual pages as needed.
I'm curious about getting one of these to run LLM models locally, but I don't understand the cost benefit very well. Even 128GB can't run, like, a state of the art Claude 3.5 or GPT 4o model right? Conversely, even 16GB can (I think?) run a smaller, quantized Llama model. What's the sweet spot for running a capable model locally (and likely future local-scale models)?
You'll be able to run 72B models w/ large context, lightly quantized with decent'ish performance, like 20-25 tok/sec. The best of the bunch are maybe 90% of a Claude 3.5.
If you need to do some work offline, or for some reason the place you work blocks access to cloud providers, it's not a bad way to go, really. Note that if you're on battery, heavy LLM use can kill your battery in an hour.
Lots of discussion and testing of that over on https://www.reddit.com/r/LocalLLaMA/, worth following if you're not already.
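Rough sketch of where numbers like that come from, assuming token generation is memory-bandwidth bound (each generated token streams roughly the whole weight set once). Treat it as an order-of-magnitude ceiling; real throughput depends on the quantization level, the runtime, and how much of the peak bandwidth is actually reachable:

    # Upper-bound estimate: tokens/sec ~= usable bandwidth / model size in bytes.
    def rough_tok_per_s(params_billion, bits_per_weight, bandwidth_gb_s):
        model_gb = params_billion * bits_per_weight / 8
        return bandwidth_gb_s / model_gb

    # 72B model at ~4 bits/weight against the M4 Max's quoted 546 GB/s:
    print(rough_tok_per_s(72, 4, 546))  # ~15 tok/s ceiling before any overhead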
Claude 3.5 and GPT 4o are huge models. They don't run on consumer hardware.
We run our LLM workloads on an M2 Ultra because of this. 2x the VRAM; the one-time cost of $5,350 was the same as, at the time, 1 month of an 80GB-VRAM GPU in GCP. Works well for us.
Can you elaborate: are those workflows queued, or can they serve multiple users in parallel?
I think it's super interesting to know about real-life workflows and the performance of different LLMs and hardware, in case you can direct me to other resources. Thanks!
Our use case is atypical, based on what others seem to require. While we serve multiple requests in parallel, our workloads are not 'chat'.
If the 2x multiplier holds up, the Ultra update should bring it up to 1092GBps. Amazing.
There isn't even an M3 Ultra. Will there be an M4 Ultra?
At some point there should be an upgrade to the M2 Ultra. It might be an M4 Ultra, it might be this year or next year. It might even be after the M5 comes out. Or it could be skipped in favour of the M5 Ultra. If anyone here knows they are definitely under NDA.
M3 was built on an expensive process node, I don’t think it was ever meant to be around long.
That would make the most sense for the next Mac Studio version.
There were rumors that the next Mac Studio will top out at 512GB of RAM, too.
Good news for anyone who wants to run 405B LMs locally...
And the week isn't over...
They announced earlier in the week that there will only be three days of announcements
Comparing a laptop to an A100 (312 teraFLOPS) or H100 (~1 petaFLOP) server is a stretch, to say the least.
An M2 is, according to a Reddit post, around 27 TFLOPS.
So less than 1/10 the raw compute performance, let alone the memory.
What workflow would use something like this?
They aren't going to be using fp32 for inferencing, so those FP numbers are meaningless.
Memory capacity and memory bandwidth matter most for inference. 819.2 GB/s for the M2 Ultra is less than half that of an A100, but having 192GB of RAM instead of 80GB means they can run inference on models that would require THREE of those A100s, and the only real cost is that it takes longer for the AI to respond.
3 A100 at $5300/mo each for the past 2 years is over $380,000. Considering it worked for them, I'd consider it a massive success.
From another perspective though, they could have bought 72 of those Ultra machines for that much money and had most devs on their own private instance.
The simple fact is that Nvidia GPUs are massively overpriced. Nvidia should worry a LOT that Apple's private AI cloud is going to eat their lunch.
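Back-of-the-envelope for those numbers, using the figures quoted in this thread (the $5,300/month A100 rate and the $5,350 one-time M2 Ultra price, not current list prices):

    # Rough rent-vs-buy comparison based on the prices quoted above.
    a100_per_month = 5300
    a100_count = 3
    months = 24
    rental_total = a100_per_month * a100_count * months
    m2_ultra_price = 5350

    print(rental_total)                    # $381,600 over two years
    print(rental_total / m2_ultra_price)   # ~71 M2 Ultras for the same spend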
> comparing a laptop
Small correction: the M2 Ultra isn't found in laptops, its in the Studio.
About 10-20% of my company's GPU usage is inference dev. Yes, a horribly inefficient use of resources. We could upgrade the 100-ish devs who do this work to M4 MBPs and free up GPU resources.
Smart move by Apple
Right now, there are $0.90-per-hour H100 80GBs that you can rent.
You have another one with a network gateway to provide hot failover?
Right?
High availability story for AI workloads will be a problem for another decade. From what I can see the current pressing problem is to get stuff working quickly and iterate quickly.
For context, the 4090 has 1,008 GB/s of bandwidth.
... but only 1/4 of the actual memory, right ?
The M4-Max I just ordered comes with 128GB of RAM.
I have M3 Max with 128GB of ram, it's really liberating.
I have 32gb and I've never felt like I needed more.
Having 128GB is really nice if you want to regularly run different full OSes as VMs simultaneously (and if those OSes might in turn have memory-intensive workloads running on them).
Somewhat niche case, I know.
No one needs more than 640kB.
Obviously you're not a golfer.
https://www.youtube.com/watch?v=YzhKEHDR_rc :-) Thanks for that, I think I will watch The Big Lebowski tonight!
Far out, man
:P
At least in the recent past, a hindrance was that MacOS limited how much of that unified memory could be assigned as VRAM. Those who wanted to exceed the limits had to tinker with kernel settings.
I wonder if that has changed or is about to change as Apple pivots their devices to better serve AI workflows as well.
I am always wondering if one shouldn't be doing the resource intensive LLM stuff in the cloud. I don't know enough to know the advantages of doing it locally.
you'd probably save money just paying for a VPS. And you wouldn't cook your personal laptop as fast. Not that people nowadays keep their electronics for long enough for that to matter :/
Well it's more like pick your poison, cause all options have caveats:
- Apple: all the capacity and bandwidth, but no compute to utilize it
- AMD/Nvidia: all the compute and bandwidth, but no capacity to load anything
- DDR5: all the capacity, but no compute or bandwidth (cheap tho)
Why was this downvoted?
To quote an old meme, "They hated Jesus because he told them the truth."
This is definitely tempting me to upgrade my M1 macbook pro. I think I have 400GB/s of memory bandwidth. I am wondering what the specific number "over half a terabyte" means.
546
Curious, what are others using local LLMs on a MBP for? Hobby?
Need more memory; 256GB would be nice. Mistral Large is 123B. Can't even give a quantized Llama 405B a spin. LLM users rejoice; LLM power users, weep.
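A quick weight-only sizing sketch (it ignores the KV cache and runtime overhead) showing why even 128GB doesn't cover the really big dense models:

    # Approximate weight memory in GB: parameters (billions) * bits per weight / 8.
    def weights_gb(params_billion, bits_per_weight):
        return params_billion * bits_per_weight / 8

    print(weights_gb(123, 4))  # Mistral Large (~123B) at 4-bit: ~62 GB -> fits in 128GB with headroom
    print(weights_gb(405, 4))  # Llama 405B at 4-bit: ~203 GB -> needs that hoped-for 256GB machine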
The weird thing about these Apple product videos in the last few years is that there are all these beautiful shots of Apple's campus with nobody there other than the presenter. It's a beautiful stage for these videos, but it's eerie and disconcerting, particularly given Apple's RTO approach.
I think it’s usually filmed on weekends
You would just think that with a brand so intrinsically wrapped around the concept of technology working for and with the people that use it, you'd want to show the people who made it if you're going to show the apple campus at all.
It kind of just comes off as one of those YouTube liminal space horror videos when it's that empty.
The Apple brand is - foundationally - pretty solitary.
Think about the early ipod ads, just individuals dancing to music by themselves. https://www.youtube.com/watch?v=_dSgBsCVpqo
You can even go back to 1983 "Two kinds of people": a solitary man walks into an empty office, works by himself on the computer and then goes home for breakfast. https://youtu.be/4xmMYeFmc2Q
It's a strange conflict. So much of their other stuff is about togetherness mediated by technology (eg, facetime). And their Jobs-era presentations always ended with a note of appreciation for the folks who worked so hard to make the launch happen. But you're right that much of the brand imagery is solitary, right up to the whole "Here's to the crazy ones" vibe.
It's weirdly dystopian. I didn't realize it bothered me until moments before my comment, but now I can't get it out of my head.
"Here's to the crazy ones" is weirdly dystopian? :)
If only in some shots, but they are such a valuable company that they simply cannot afford the risk of e.g. criticism for the choice of people they display, or inappropriate outfits or behaviour. One blip from a shareholder can cost them billions in value, which pisses off other shareholders. All of their published media, from videos like this to their conferences, are highly polished, rehearsed, and designed by committee. Microsoft and Google are the same, although at least with Google there's still room for some comedy in some of their departments: https://youtu.be/EHqPrHTN1dU
> You would just think that with a brand so intrinsically wrapped around the concept of technology working for and with the people that use it, you'd want to show the people who made it if you're going to show the apple campus at all.
I would think that a brand that is at least trying to put some emphasis on privacy in their products would also extend the same principle to their workforce. I don’t work for Apple, but I doubt that most of their employees would be thrilled about just being filmed at work for a public promo video.
There are legal issues with it too, or at least they think there are. They take down developer presentations after a few years partly so they won't have videos of random (ex-)employees up forever.
What legal issues could arise from a recording of an employee publicly representing the company?
Privacy and likeness rights, that kind of thing. And licenses expiring on stock photos or whatever's in the background.
> the concept of technology working for and with the people that use it
> liminal space horror
reminds me of that god awful crush commercial
I had not seen that one, so I looked it up.
This was reminder to me that art is subjective. I don’t get the outrage. I kinda like it.
they apologized for that one.
I interviewed there in 2017 and honestly even back then the interior of their campus was kind of creepy in some places. The conference rooms had this flat, bland beige that reminded me of exactly the kind of computers the G3 era was trying to get away from, but the size of a room, and you were inside it.
The Mac mini video from yesterday has employees: https://www.apple.com/105/media/us/mac-mini/2024/58e5921e-f4...
I used to think the videos with all of the drone fly-bys was cool. But in the last year or so, I've started to feel the same as you. Where are all the people? It's starting to look like Apple spent a billion dollars building a technology ghost town.
Surely the entire staff can't be out rock climbing, surfing, eating at trendy Asian-inspired restaurants at twilight, and having catered children's birthday parties in immaculately manicured parks.
Oh I think they're very well done and very pretty! But lately this discomfort has started to creep in, as you note. Like something you'd see in a WALL-E spinoff: everyone has left the planet already but Buy n Large is still putting out these glorious promo videos using stock footage. Or, like, post-AI apocalypse, all the humans are confined to storage bins, but the proto-AI marketing programs are still churning out content.
The neighboring city charges $100k per newly constructed unit for park maintenance fees. So there actually are a lot of nice parks.
https://x.com/maxdubler/status/1778841932141408432
Edit: If you click on the "go deeper on M4 chips", you'll get some comparisons that are less inflated, for example, code compilation on pro:
So here the M4 Pro is 67% faster than the M1 Pro, and 18% faster than the M3 Pro. It varies by workload, of course.
No benchmarks yet, but this article gives some tables of comparative core counts, max RAM, and RAM bandwidths: https://arstechnica.com/apple/2024/10/apples-m4-m4-pro-and-m...
I'm pleased that the Pro's base memory starts at 16 GB, but surprised they top out at 32 GB:
> ...the new MacBook Pro starts with 16GB of faster unified memory with support for up to 32GB, along with 120GB/s of memory bandwidth...
I haven't been an Apple user since 2012 when I graduated from college and retired my first computer, a mid-2007 Core2 Duo Macbook Pro, which I'd upgraded with a 2.5" SSD and 6GB of RAM with DDR2 SODIMMs. I switched to Dell Precision and Lenovo P-series workstations with user-upgradeable storage and memory... but I've got 64GB of RAM in the old 2019 Thinkpad P53 I'm using right now. A unified memory space is neat, but is it worth sacrificing that much space? I typically have a VM or two running, and in the host OS and VMs, today's software is hungry for RAM and it's typically cheap and upgradeable outside of the Apple ecosystem.
> I'm pleased that the Pro's base memory starts at 16 GB, but surprised they top out at 32 GB:
That's an architectural limitation of the base M4 chip, if you go up to the M4 Pro version you can get up to 48GB, and the M4 Max goes up to 128GB.
The new mac mini also has an M4 Pro that goes up to 64GB.
The "base level" Max is limited at 36GB. You have to get the bigger Max to get more.
The M4 tops off at 32 GB
The M4 Pro goes up to 48 GB
The M4 Max can have up to 128 GB
It seems you need the M4 Max with the 40-core GPU to go over 36GB.
The M4 Pro with 14‑core CPU & 20‑core GPU can do 48GB.
If you're looking for somewhere in the ~36-48GB memory range, here are the options:
$2,800 = 48GB, Apple M4 Pro chip with 14‑core CPU, 20‑core GPU
$3,200 = 36GB, Apple M4 Max chip with 14‑core CPU, 32‑core GPU
$3,600 = 48GB, Apple M4 Max chip with 16‑core CPU, 40‑core GPU
So the M4 Pro could get you a lot of memory, but fewer GPU cores. Not sure how much those GPU cores factor into performance; I only really hear complaints about the memory limits... Something to consider if you're looking to buy in this range of memory.
Of course, a lot of people here probably consider it not a big deal to throw an extra 3 grand on hardware, but I'm a hobbyist in academia when it comes to AI; I don't make a big 6-figure salary :-)
Somehow I got downvoted for pointing this out, but it's weird that you have to spend an extra $800 USD just to be able to surpass 48GB, and "upgrading" to the base-level Max chip decreases your RAM limit, especially when the M4 Pro on the Mac Mini goes up to 64GB. Like... that's a shitload of cash to put out if you need more RAM but don't care for more cores. I was really hoping to finally upgrade to something with 64GB, or maybe 96 or 128 if it decreased in price, but they removed the 96 option and kept 64 and 128 severely out of reach.
Do I get 2 extra CPU cores, build a budget gaming PC, or subscribe to creative suite for 2.5 years!?
It doesn't look that cut and dried.
M4 Max 14 core has a single option of 36GB.
M4 Max 16 core lets you go up to 128GB.
So you can actually get more ram with the Pro than the base level Max.
Interesting tidbit: MacBook Airs also now start at 16GB. Same price!
I haven't done measurements on this, but my Macbook Pro feels much faster at swapping than any Linux or Windows device I've used. I've never used an M.2 SSD so maybe that would be comparable, but swapping is pretty much seamless. There's also some kind of memory compression going on according to Activity Monitor, not sure if that's normal on other OSes.
Yes, other M.2 SSDs have comparable performance when swapping, and other operating systems compress memory, too — though I believe not as much as MacOS.
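For anyone curious how hard their Mac is actually leaning on swap and the compressor, here's a quick sketch using two built-in tools. vm.swapusage is a standard macOS sysctl; I'm assuming memory_pressure's one-shot output here, which varies a bit by macOS version:

    import subprocess

    # Swap file usage (total / used / free).
    print(subprocess.run(["sysctl", "vm.swapusage"],
                         capture_output=True, text=True).stdout.strip())

    # One-shot memory pressure / compressor summary.
    # ASSUMPTION: exact output format depends on the macOS version.
    print(subprocess.run(["memory_pressure"],
                         capture_output=True, text=True).stdout)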
Although machines with Apple Silicon swap flawlessly, I worry about degrading the SSD, which is non-replaceable. So ultimately I pay for more RAM so I don't need swapping at all.
Degrading the SSD is a good point. This is thankfully a work laptop so I don't care if it lives or dies, but it's something I'll have to consider when I eventually get my own Mac.
That's on the standard M4 processor. If you move to the M4 Pro it tops out at 48GB, and moving to the M4 Max goes up to 128GB.
Weird that the M4 Pro in the Mac mini can go up to 64GB. Maybe a size limitation on the MBP motherboard or SOC package?
Probably just Apple designing the pricing ladder.
It looks like there are different versions of the 'Pro' based on core count and memory bandwidth. I'm assuming the 12-core Mini M4 Pro has the same memory bandwidth/channels enabled as the 14-core MBP M4 Pro, enabling the 64GB. My guess would be that it's related to binning and/or TDP.
The 96GB RAM option of the M3 Max disappeared.
The max memory is dependent on which tier M4 chip you get. The M4 max chip will let you configure up to 128gb of ram
It looks like the 14 core M4 Max only allows 36GB of ram. The M4 Pro allows for up to 48GB. It's a bit confusing.
Can anyone comment on the viability of using an external SSD rather than upgrading storage? Specifically for data analysis (e.g. storing/analysing parquet files using Python/duckdb, or video editing using divinci resolve).
Also, any recommendations for suitable ssds, ideally not too expensive? Thank you!
Don't bother with Thunderbolt 4, go for a USB4 enclosure instead - I've got a Jeyi one. Any SSD will work; I use a Samsung 990 Pro inside. It was supposed to be the fastest you can get - I get over 3000MB/s.
Here is the rabbit hole you might want to check out: https://dancharblog.wordpress.com/2024/01/01/list-of-ssd-enc...
Though TB5 should be better. I think you can already find some of these on Aliexpress.
https://www.winstars.com/en_us/category/Thunderbolt_5.html
It's totally fine.
With a TB4 case with an NVME you can get something like 2300MB/s read speeds. You can also use a USB4 case which will give you over 3000MB/s (this is what I'm doing for storing video footage for Resolve).
With a TB5 case you can go to like 6000MB/s. See this SSD by OWC:
https://www.owc.com/solutions/envoy-ultra
I'm a little sus of OWC these days: their drives are way expensive, never get any third-party reviews or testing, and their warranty is horrible (3 years). I've previously sworn by them, so it's a little disappointing.
The only OWC product I own is a TB4 dock and so far it has been rock solid.
Hopefully in the next days/weeks we’ll see TB5 external enclosures and you’ll be able to hit very fast speeds with the new Macs. I would wait for those before getting another enclosure now.
AFAIK the main OEM producer is Winstars, though I could only find sketchy-looking Aliexpress sellers so far.
https://www.winstars.com/en_us/category/Thunderbolt_5.html
> Also, any recommendations for suitable ssds, ideally not too expensive?
I own a media production company. We use Sabrent Thunderbolt external NVMe TLC SSDs and are very happy with their price, quality, and performance.
I suggest you avoid QLC SSDs.
Basically any good SSD manufacturer is fine, but I've found that the enclosure controller support is flaky with Sonoma. Drives that appear instantly in Linux sometimes take ages to enumerate in OSX, and only since upgrading to Sonoma. Stick with APFS if you're only using it for Mac stuff.
I have 2-4TB drives from Samsung, WD and Kingston. All work fine and are ridiculously fast. My favourite enclosure is from DockCase for the diagnostic screen.
The USB-C ports should be quite enough for that. If you are using a desktop Mac, such as an iMac, Mini, or the Studio and Pro that will be released later this week, this is a no-brainer - everything works perfectly.
I go with the Acasis Thunderbolt enclosure and then pop in an NVMe of your choice, but generic USB drives are pretty viable too ... Thunderbolt can be booted from, while USB can't.
I tried another brand or two of enclosures and they were HUGE, while the Acasis was credit-card sized (except thickness).
I've used a Samsung T5 SSD as my CacheClip location in Resolve and it works decently well! Resolve doesn't always tolerate disconnects very well, but when it's plugged in things are very smooth.
With a thunderbolt SSD you'll think your external drive is an internal drive. I bought one of these (https://www.amazon.com/gp/product/B0BGYMHS8Y) for my partner so she has snappy photo editing workflows with Adobe CC apps. Copying her 1TB photo library over took under 5 min.
I had a big problem with Crucial 4TB SSDs recently, using them as Time Machine drives. The first backup would succeed, the second would fail, and the disk would then be unrepairable in Disk Utility, which would also refuse to format to non-APFS (and an APFS reformat wouldn't fix it).
Switched to samsung t9s, so far so good.
I edit all my video content from a USB-attached SSD with Resolve on my MBP.
My only complaint is that Apple gouges you for memory and storage upgrades. (But in reality I don't want the raw and rendered video taking up space on my machine).
Run your current workload on internal storage and check how fast it is reading and writing.
For video editing - even 8K RAW - you don't need insanely fast storage. A 10GBit/s external SSD will not slow you down.
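If you want a ballpark number before buying anything, a crude sketch like this will do. It's no substitute for a real tool (Blackmagic Disk Speed Test, fio); the path and 1 GiB size are placeholders, and the read figure can be inflated by the OS page cache:

    import os, time

    path = "/Volumes/ExternalSSD/throughput_test.bin"  # placeholder: point at the drive under test
    size = 1 << 30                                     # 1 GiB test file
    chunk = os.urandom(1 << 20)                        # 1 MiB of incompressible data

    t0 = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(size // len(chunk)):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())                           # make sure the data actually hit the disk
    write_s = time.perf_counter() - t0

    t0 = time.perf_counter()
    with open(path, "rb") as f:
        while f.read(1 << 24):                         # read back in 16 MiB chunks
            pass
    read_s = time.perf_counter() - t0                  # note: page cache can inflate this number

    os.remove(path)
    print(f"write: {size / write_s / 1e6:.0f} MB/s, read: {size / read_s / 1e6:.0f} MB/s")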
Get something with Thunderbolt and you’ll likely never notice a difference
> All MacBook Pro models feature an HDMI port that supports up to 8K resolution, a SDXC card slot, a MagSafe 3 port for charging, and a headphone jack, along with support for Wi-Fi 6E and Bluetooth 5.3.
No Wifi 7. So you get access to the 6 GHz band, but not some of the other features (preamble punching, OFDMA):
* https://en.wikipedia.org/wiki/Wi-Fi_7
* https://en.wikipedia.org/wiki/Wi-Fi_6E
The iPhone 16s do have Wifi 7. Curious to know why they skipped it (and I wonder if the chipsets perhaps do support it, but it's a firmware/software-not-yet-ready thing).
I was quite surprised by this discrepancy as well (my new iPhone has 7, but the new MBP does not).
I had just assumed that for sure this would be the year I upgrade my M1 Max MBP to an M4 Max. Knowing that it lacks WiFi 7, I will not be doing so. As one of the child comments notes, I count on getting a solid 3 years out of my machine, so future-proofing carries some value (and I already have WiFi 7 access points). I download terabytes of data in some weeks for the work I do, and not having to plug into Ethernet at a fixed desk to do that efficiently will be a big enough win that I will wait another year before shelling out $6k “off-cycle”.
Big bummer for me. I was looking forward to performance gains next Friday.
they hold their value well so you could buy it this year and sell it next year when you buy the new one. you'd probably only lose ~$500
Good point! I hadn’t looked at how resale value holds up. Maybe I will do that after all… thanks for the suggestion!
Yeah, I thought that was weird. None of the Apple announcements this week had WiFi7 support, just 6E.
https://www.tomsguide.com/face-off/wi-fi-6e-vs-wi-fi-7-whats...
Laptops/desktops (with 16GB+ of memory) could make use of the faster speed/more bandwidth aspects of WiFi7 better than smartphones (with 8GB of memory).
It looks like only a few people are using Wifi 7 for now. Maybe they are going to include it in the next generation, when more people will be using it.
> It looks like only a few people are using Wifi 7 for now.
Machines can last and be used for years, and it would be a presumably very simple way to 'future proof' things.
And though the IEEE spec hasn't officially been ratified as I type this, it is set to be by the end of 2024. Network vendors are also shipping APs with the functionality, so in coming years we'll see a larger and larger infrastructure footprint going forward.
The lack of Wifi7 is a real bummer for me. I was hoping to ditch the 2.5Gbe dongle and just use WiFi.
Hm why? Is 6E really so much worse than 7 in practice that 7 can replace wired for you but 6E can't? That's honestly really weird to me. What's the practical difference in latency, bandwidth or reliability you've experienced between 6E and 7?
I don't have any 6E devices, so I can't really tell for sure, but from what I read, 6E gets you a bit over 1Gbit in real-world scenarios. 7 should be able to replace my 2.5GbE dongle, or at least get much closer to it. I already have WiFi 7 Eero routers on a 2.5GbE wired backbone.
I guess it makes sense if what you do is extremely throughput-focused... I always saw consistency/reliability and latency as the benefits of wired compared to wireless, the actual average throughput has felt fast enough for a while on WiFi but I guess other people may have different needs
Yeah, this threw me as well. When the iMac didn't support WiFi 7, I got a bit worried. I have an M2, so I'm not going to get this, but the spouse needs a new Air, and I figured everything would have WiFi 7 by then; now I don't think so.
Faster is always nice, makes sense. But do you really need WiFi 7 features/speed? I don't know when I would notice a difference (on a laptop) between 600 or 1500 Mbit/s (just as an example). Can't download much anyhow as the storage will get full in minutes.
> But do you really need WiFi 7 features/speed?
One of the features is preamble punching, which is useful in more dense environments:
* https://community.fs.com/article/how-preamble-puncturing-boo...
* https://www.ruckusnetworks.com/blog/2023/wi-fi-7-and-punctur...
MLO helps with resiliency and the improved OFDMA helps with spectrum efficiency as well. It's not just about speed.
Thanks for those explainers.
Call of Duty is 200GB
Wifi 6 can do up to 4.8Gbps. Even at half of that, you're going to be limited by a 2Gbps fiber line.
The real use is transferring huge files within the LAN.
How frequently are you downloading CoD on your Mac?
This is the first compelling Mac to me. I've used Macs for a few clients and muscle memory is very deeply ingrained for linux desktops. But with local LLMs finally on the verge of usability along with sufficient memory... I might need to make the jump!
Wish I could spin up a Linux OS on the hardware though. Not a bright spot for me.
You totally can after a little bit of time waiting for M4 bringup!
https://asahilinux.org
It won't have all the niceties / hardware support of MacOS, but it seamlessly coexists with MacOS, can handle the GPU/CPU/RAM with no issues, and can provide you a good GNU/Linux environment.
Asahi doesn't work on M3 yet after a year. It's gonna be a bit before M4 support is here.
IIRC one of the major factors holding back M3 support was the lack of a M3 mini for use in their CI environment. Now that there's an M4 mini hopefully there aren't any obstacles to them adding M4 support
Why would that matter? You can use a MacBook in CI too?
How? What cloud providers offer it? MacStadium and AWS don't.
I guess you could have a physical MBP in your house and connect it to some bring-your-own-infrastructure CI setup, but most people wouldn't want to do that.
Cloud providers don't seem too relevant to a discussion of CI for kernel and driver development.
Why not?
How do you imagine that a cloud computing platform designed around running Macs with macOS would work for testing an entirely different OS running on bare metal on hardware that doesn't have a BMC, and usefully catching and logging frequent kernel panics and failed boots?
It's a pretty hard problem to partially automate for setups with an engineer in the room. It doesn't sound at all feasible for an unattended data center setup that's designed to host Xcode for compiling apps under macOS.
GitHub’s self hosted runners are as painless as they can get, and the Mac Mini in my basement is way faster than their hosted offering.
I meant using a physical device indeed.
"a little bit of time" is a bit disingenuous given that they haven't even started working on the M3.
(This isn't a dig on the Asahi project btw, I think it's great).
I miss Linux, it respected me in ways that MacOS doesn't. But maintaining a sane dev environment on linux when my co-workers on MacOS are committing bash scripts that call brew... I am glad that I gave up that fight. And yeah, the hardware sure is nice.
IIRC brew supports Linux, but it isn't a package manager I pay attention to outside of some very basic needs. Way too much supply-chain security surface to cover for it!
It does, but I prefer to keep project dependencies bound to that project rather than installing them at wider scope. So I guess it's not that I can't use Linux for work, but that I can't use Linux for work and have it my way. And if I can't have it my way anyway, then I guess Apple's way will suffice.
macOS virtualization of Linux is very fast and flexible. Their sample code shows it's easy without any kind of service/application: https://developer.apple.com/documentation/virtualization/run...
However, it doesn't support snapshots for Linux, so you need to power down each session.
I've been lightly using ollama on the m1 max and 64gb RAM. Not a power user but enough for code completions.
Off topic, but I’m very interested in local LLMs. Could you point me in the right direction, both hardware specs and models?
In general for local LLMs, the more memory the better. You will be able to fit larger models in RAM. The faster CPU will give you more tokens/second, but if you are just chatting with a human in the loop, most recent M series macs will be able to generate tokens faster than you can read them.
That also very much depends on model size. For 70B+ models, while the tok/s are still fast enough for realtime chat, it's not going to be generating faster than you can read it, even on Ultra with its insane memory bandwidth.
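A handy back-of-the-envelope check for this: single-stream decoding is mostly memory-bandwidth-bound, so tokens/sec can't exceed roughly bandwidth divided by the bytes of weights streamed per token. A sketch using the bandwidth figures quoted in this thread (it ignores KV cache and other overhead, so real numbers come in lower):

    # Rough upper bound on decode speed for a memory-bandwidth-bound LLM:
    # every generated token has to stream (almost) all weights through the chip.
    def max_tokens_per_sec(params_billions, bits_per_weight, bandwidth_gb_s):
        model_gb = params_billions * bits_per_weight / 8  # size of the weights in GB
        return bandwidth_gb_s / model_gb

    # Theoretical ceilings, using bandwidth numbers mentioned upthread:
    print(max_tokens_per_sec(70, 4, 540))   # 70B @ 4-bit on ~540 GB/s M4 Max -> ~15 tok/s
    print(max_tokens_per_sec(8, 4, 120))    # 8B @ 4-bit on ~120 GB/s base M4 -> ~30 tok/s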
https://www.reddit.com/r/LocalLLaMA/ https://www.reddit.com/r/SillyTavernAI/
Thanks to both of you!
Have a look at ollama? I think there is a vscode extension to hook into local LLM if you are so inclined: https://ollama.com/blog/continue-code-assistant
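To make that concrete: once Ollama is running and a model has been pulled (the model name below is just an example), its local HTTP API is a few lines of plain Python. A minimal sketch, not a full client:

    import json
    import urllib.request

    # Minimal, dependency-free call to a locally running Ollama server.
    # Assumes `ollama pull llama3.1` (or whatever model you prefer) has been done.
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({
            "model": "llama3.1",
            "prompt": "Explain unified memory in one paragraph.",
            "stream": False,
        }).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])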
Get as much RAM as you can stomach paying for.
You can spin up a Unix OS. =) It’s even older than Linux.
NeXTSTEP, which macOS is ultimately based on, is indeed older than Linux (first release was 1989). But why does that matter? The commenter presumably said "Linux" for a reason, i.e. they want to use Linux specifically, not just any UNIX-like OS.
Sure. But not everybody. That’s how I ended up on a Mac. I needed to develop for Linux servers and that just sucked on my windows laptop (I hear it’s better now?). So after dual booting fedora on my laptop for several months I got a MacBook and I’ve never looked back.
BSD is fun (not counting MacOS in the set there), but no, my Unix experiences have universally been on legacy hardware, oversubscribed and undermaintained. Not my favorite place to spend any time.
Check out Asahi linux
It seems they also update the base memory on MacBook Air:
> MacBook Air: The World’s Most Popular Laptop Now Starts at 16GB
> MacBook Air is the world’s most popular laptop, and with Apple Intelligence, it’s even better. Now, models with M2 and M3 double the starting memory to 16GB, while keeping the starting price at just $999 — a terrific value for the world’s best-selling laptop.
Wow, I didn't expect them to update the older models to start at 16GB and no price increase. I guess that is why Amazon was blowing the 8GB models out at crazy low prices over the past few days.
Costco was selling MB Air M2 8 GB for $699! Incredible deal.
I’ve been using the exact model for about a year and I rarely find limitations for my typical office type work. The only time I’ve managed to thermally throttle it has been with some super suboptimal Excel Macros.
I'm waiting for the 16 GB M2 Air to be super cheap to pick one up to use with Asahi Linux!
I was seeing $699 MB Air M1 8 GB on Amazon India a week ago.
But no update to a M4 for the MacBook Air yet unfortunately. I would love to get an M4 MacBook Air with 32GB.
I believe the rumor is that the MacBook Air will get the update to M4 in early spring 2025, February/March timeline.
This is the machine I'm waiting for. Hopefully early 2025
There are still a couple days left this week.
They said there would be three announcements this week and this is the third
They did? The tweet that announced stuff from the head of marketing did not mention 3 days.
That said, I believe you. Some press gets a hands-on on Wednesday (today) so unless they plan to pre-announce something (unlikely) or announce software only stuff, I think today is it.
"This is a huge week for the Mac, and this morning, we begin a series of three exciting new product announcements that will take place over the coming days," said Apple's hardware engineering chief John Ternus, in a video announcing the new iMac.
Ah, thanks. I was referring to last weeks Tweet. I didn’t watch the iMac video.
That's disappointing. I was expecting a new Apple TV because mine needs replacement and I don't really feel inclined to get one that's due for an upgrade very soon.
Also, Studio and Pro are hanging there.
The current-gen Apple TV is already overpowered for what it does, and extremely nice to use. I can think of very few changes I would like to see, and most of them are purely software.
I really wish it had some way to connect USB storage directly.
Mine has 128GB of onboard storage... but Apple still bans apps from downloading video, which annoys me.
The streaming apps virtually all support downloading for offline viewing on iPhone, but the Apple TV just becomes a paperweight when the internet goes out, because I'm not allowed to use the 128GB of storage for anything.
If they're not going to let you use the onboard storage, then it seems unlikely for them to let you use USB storage. So, first, I would like them to change their app policies regarding internal storage, which is one of the purely software improvements I would like to see.
I use a dedicated NAS as a Plex server + Plex app on Apple TV itself for local streaming, which generally works fine. Infuse app can also index and stream from local sources.
But there are some cases like e.g. watching high-res high-FPS fractal zoom videos (e.g. https://www.youtube.com/watch?v=8cgp2WNNKmQ) where even brief random skipped frames from other things trying to use WiFi at the same time can be really noticeable and annoying.
I do strongly recommend using Ethernet, unless you have the WiFi-only model, but gotcha.
There really isn't a chance they'll update the same product twice in a week.
They haven't officially updated it. They just discontinued the smaller model.
It would make more sense to discontinue the smaller model along with some other updates to the line. Or in other words, Air won't receive any other updates this week unfortunately.
Given that the Mini and iMac have received support for one more additional external display (at 60Hz 6K), I hope we’ll see the same on the MBA M4.
The big question for me is whether they will have a matte option for the Air. I want a fanless machine with a matte screen.
Unfortunately Apple won’t tell you until the day they sell the machines.
1TB+ iPad Pro can be a fanless machine with a matte screen
See that’s the thing. Given that somehow you need 1TB to get the matte screen, I feel like Apple is using it as a way to upsell. It would indicate that perhaps Apple won’t offer a matte MacBook Air.
It'll be interesting to see the reaction of tech commentators about this. So many people have been screaming at Apple to increase the base RAM and stop price gouging their customers on memory upgrades. If Apple Intelligence is the excuse the hardware team needed to get the bean counters on board, I'm not going to look a gift horse in the mouth!
So we can scream about the lousy base storage, which is the same as my phone. Yikes.
It wouldn't surprise me if people typically use more storage on their phone than their computer. The phone should probably have a higher base storage than the base storage of their laptops.
Extremely interesting point. My initial reaction to your comment is that it is a crazy thing to say, but the more I think about it the more I agree with you. On my phone is where I have tons of 4k 30/60FPS videos, high resolution photos (with live), and videos downloaded on Netflix and YouTube.
On my Mac I don't have any of these things, it's mostly for programming and some packages. I'm almost always connected to Wi-Fi (except on planes) so I don't really need any photos or videos.
The only people that I see have high storage requirements on Macs are probably video/media creators? As a programmer I'm totally fine with 512GB, but could probably live with 256GB if I wanted to be super lean.
But still just 256GB SSD Storage. £200 for the upgrade to 512GB (plus a couple more GPU cores that I don't need. Urgh.
It’s stationary. Just get a Thunderbolt NVMe drive and leave it plugged in
Why buy a laptop then if you're lugging all those external hard drives?
Just invest in the model with more storage then?
Right, and back round we go: £200 for that is terrible value.
And it's still only 512GB! The M4 version coming in the new year will surely bump this up to something more sensible.
Thanks to your comment, I persuaded my friend who recently purchased a 24GB M3 Air to claim the price-drop refund, and we got $200 back (where we live, reimbursement for a price drop is valid for 14 days after the date of delivery).
Every M-series device now comes with at least 16GB, except for the base iPad Pro, right?
Correct, every Mac computer starts at 16gb now. 256gb/512gb iPad Pro is 8gb, 1tb/2tb is 16gb.
At least all the M4 Macs. I’m not sure of every older M config has been updated, though at least some have been.
The only older configs that Apple sells are the M2 and M3 Airs, which were bumped. Everything else is now on M4, or didn't have an 8gb base config (Mac Studio, Mac Pro)
Ohh, good catch. Sneaking that into the MBP announcement. I skimmed the page and missed that. So a fourth announcement couched within the biggest of the three days.
If only they would bring back the 11" Air.
I've seen a lot of people complaining about 8GB but honestly my min spec M1 Air has continued to be great. I wouldn't hesitate to recommend a refurb M1 8GB Air for anyone price conscious.
> while keeping the starting price at just $999 — a terrific value for the world’s best-selling laptop
Only in the US, it seems. India got a price increase of $120.
Just cancelled my order for a 24GB MacBook Air 15-inch, and then ordered the exact same setup. Saved around $300!
Yeah, this one caught me off guard. We just purchased a MacBook Air in the last month and if we bought the same one now, we would save $200. Apple support would not price match/correct that, so we will be returning it and purchasing anew.
I guess that implies the MacBook Air won't be updated this week.
Makes me wonder what else will be updated this week (Studio or Mac Pro)?
Great news. The Pro is kind of heavy for my liking, so the Air is the way to go.
It's not just the weight - Air is also fanless (and still runs cold).
And yes, with enough RAM, it is a surprisingly good dev laptop.
Really too bad you cannot upgrade to 32GB RAM though =(
I think spec-wise the Air is good enough for almost everyone who isn't doing video production or running local LLMs, I just wish it had the much nicer screen that the Pro has. But I suppose they have to segregate the product lines somehow.
Well, the issue for me with memory on these new models is that the base Max ships with 36GB and NO option to configure more. Getting more memory is gated behind a $300 CPU upgrade (plus the memory cost).
I'm sorry, but any laptop that costs $1000 should come with 64 gigs minimum, or expandable slots.
Can you actually point out any retail laptops with that spec for that price?
Tell me you are poor without telling me you are poor.
Just kidding! As an Apple Shareholder I feel like you should take what Apple gives you and accept the price. ;)
Nano-texture option for the display is nice. IIRC it's the first time since the 2012 15" MBP that a matte option has been offered?
I hope that the response times have improved, because it has been quite poor for a 120 Hz panel.
My one concern is that nano-texture apple displays are a little more sensitive to damage, and even being super careful with my MBPs I get the little marks from the keyboard when you carry the laptop with your hand squeezing the lid and bottom (a natural carry motion).
Put a facial tissue over keyboard before closing the lid.
Macs - they just work.
Haha, yeah, but it applies to all laptops really. Keys inevitably scratch the display.
> IIRC it's the first time since the 2012 15" MBP that a matte option has been offered?
The so-called "antiglare" option wasn't true matte. You'd really have to go back to 2008.
Love the nano-texture on the Studio Display, but my MacBooks have always suffered from finger oil rubbing the screen from the keys. Fingerprint oil on nano-texture sounds like a recipe for disaster.
For my current laptop, I finally broke down and bought a tempered glass screen protector. It adds a bit of glare, but wipes clean — and for the first time I have a one-year-old MacBook that still looks as good as new.
The iPad has nano texture and I find it does a much better job with oily fingerprints.
I put a thin screen cleaner/glasses cleaner cloth on the keyboard whenever I close the lid. That keeps the oils off the screen as well as prevents any pressure or rubbing from damaging the glass.
I tried that, unfortunately didn't work for me at all.
It's also on the iPad Pro. Only downside is you really do need the right cloth to be able to clean it.
I believe the laptop ships with the cloth. That said, it is annoying to have to remember to always keep that one cloth with your laptop.
They brought back the matte screen! Omg. The question is, will they have that for the air.
(I tend to feel if you want something specialized, you gotta pay for the expensive model)
Yes. It's finally back.
If I remember correctly, the claim was that M3 is 1.6x faster than M1. M4 is now 1.8x faster than M1.
It sounds more exciting than M4 is 12.5% faster than M3.
If your goal is to sell more MBPs (and this is marketing presentation) then, judging by the number of comments that have the phrase "my M1" and the top comment, it seems like M1 vs M4 is the right comparison to make. Too many people are sticking with their M1 machines. Including me.
It's actually interesting to think about. Is there a speed multiplier that would get me off this machine? I'm not sure there is. For my use case the machine performance is not my productivity bottleneck. HN, on the other hand... That one needs to be attenuated. :)
There aren't that many people that upgrade something like an MBP every year, most of us keep them longer than that.
I've just ordered an (almost) top-of-the-range MBP Max, my current machine is an MBP M1-max, so the comparisons are pretty much spot-on for me.
Selling the M1 Ultra Studio to help pay for the M4 MBP Max, I don't think I need the Studio any more, with the M4 being so much faster.
Most people buying a new MacBook don’t have the previous version, they’re going much further back. That’s why you see both intel and m1 comparisons.
No it isn't. It's because 1.8x faster sounds better than 12% faster.
Back when Moore's law was still working they didn't skip generations like this.
Back when Moore's law was still working they didn't release three subsequent versions of the same product in 22 months.
The M1 was released 4 years ago.
Both the M2 and M3 MBP were released in 2023.
Looking at https://en.wikipedia.org/wiki/Apple_M4#Comparison_with_other...
M4 is built with TSMC's 2nd Gen 3nm process. M3 is on the 1st gen 3nm.
For the base M3 vs base M4:
- the CPU (4P+4E) & GPU (8) core counts are the same
- NPU perf is slightly better for M4, I think, (M4's 38TOPS @ INT8 vs M3's 18TOPS @ INT16)
- Memory Bandwidth is higher for M4 (120 GB/s vs 102.4 GB/s)
- M4 has a higher TDP (22W vs 20W)
- M4 has higher transistor count (28B vs 25B)
It does and it gets even worse when you realize those stats are only true under very specific circumstances, not typical computer usage. If you benchmarked based on typical computer usage, I think you'd only see gains of 5% or less.
Anyone know of articles that deep dive into "snappiness" or "feel" computer experiences?
Everyone knows SSDs made a big difference in user experience. For the CPU, normally if you aren't gaming at high settings or "crunching" something (compiling or processing video etc.) then it's not obvious why CPU upgrades should be making much difference even vs. years-old Intel chips, in terms of that feel.
There is the issue of running heavy JS sites in browsers but I can avoid those.
The main issue seems to be how the OS itself is optimized for snappiness, and how well it's caching/preloading things. I've noticed Windows 10 file system caching seems to be not very sophisticated for example... it goes to disk too often for things I've accessed recently-but-not-immediately-prior.
Similarly, when it comes to generating heat, if laptops are getting hot even while doing undemanding office tasks with huge periods of idle time, then basically it points to stupid software -- or let's say poorly balanced software (likely aimed more at benchmark numbers than at user experience).
https://nanoreview.net/en/cpu-compare/apple-m1-vs-amd-ryzen-...
So far I’m only reading comments here about people wow’d by a lot of things it seemed that M3 pretty much also had. Not seeing anything new besides “little bit better specs”
The M4 is architecturally better than the M3, especially on GPU features IIRC, but you’re right it’s not a total blow out.
Not all products got the M3, so in some lines this week is the first update in quite a while. In others like MBP it’s just the yearly bump. A good performing one, but the yearly bump.
Yes, upgrading from a m3 max to a m4 max would be a waste.
They actually omitted M2 from a lot of the comparisons, which isn't surprising because M3 was only 10-15% faster.
Maybe they are highlighting stats which will help people upgrade. Few will upgrade from M3 to M4. Many from M1 to M4. That's my guess.
I have to admit, 4 generations in, 1.8x is decent but slightly disappointing all the same.
I'd really like to justify upgrading, but a $4k+ spend needs to hit greater than 2x for me to feel it's justified. 1.8x is still "kind of the same" as what I have already.
What's the consensus regarding best MacBooks for AI/ML?
I've heard it's easier to just use cloud options, but I sill like the idea of being able to run actual models and train them on my laptop.
I have a M1 MacBook now and I'm considering trading in to upgrade.
I've seen somewhat conflicting things regarding what you get for the money. For instance, some reports recommending a M2 Pro for the money IIRC.
Training is not practical. For inference they're pretty great though, especially if you go up in the specs and add a bunch of memory.
To run LLMs locally (Ollama/LLM Notebook), you want as much memory as you can afford. For actually training toy models yourself for learning/experiments in my experience it doesn't matter much. PyTorch is flexible.
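On the PyTorch side, the only Mac-specific bit for toy training is selecting the Metal (MPS) backend; the rest is ordinary PyTorch. A minimal sketch with placeholder layer and batch sizes:

    import torch

    # Use Apple's Metal Performance Shaders backend when available,
    # otherwise fall back to CPU.
    device = torch.device("mps" if torch.backends.mps.is_available() else "cpu")

    model = torch.nn.Sequential(
        torch.nn.Linear(512, 512),
        torch.nn.ReLU(),
        torch.nn.Linear(512, 10),
    ).to(device)

    x = torch.randn(64, 512, device=device)    # toy batch
    out = model(x)
    print(out.device, out.shape)               # e.g. mps:0 torch.Size([64, 10])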
> MacBook Air is the world’s most popular laptop, and with Apple Intelligence, it’s even better. Now, models with M2 and M3 double the starting memory to 16GB, while keeping the starting price at just $999 — a terrific value for the world’s best-selling laptop.
This is nice, and long overdue.
I’m really excited about the nano-texture display option.
It's essentially a matte coating, but the execution on iPad displays is excellent. While it doesn't match the e-ink experience of devices like the Kindle or ReMarkable, it's about 20-30% easier on the eyes. The texture also feels great (even though that's less relevant for a laptop), and the glare reduction is a welcome feature.
I prefer working on the MacBook screen, but I nearly bought an Apple Studio Display XDR or an iPad as a secondary screen just for that nano-texture finish. It's super good news that this is coming to the MacBook Pro.
Do you actually have to wipe the screen with the included special cloth? The screen on all of the macbooks that I've had usually get oily patches because of the contact with keycaps, so I have to wipe the screen regularly.
I wipe all my devices with regular paper towels with a tad of water. Including my $5k Apple XDR display.
I am probably not the best example to emulate lol.
I have Pro Display XDR with nano coating and the manual definitely says to only use their special cleaning cloth (or with some isopropyl alcohol). The standard coating might not need it though.
How is the contrast? The HDR content? Any downsides?
I will upgrade to M4 Pro and really hate the glare when I travel (and I do that a lot) but at the same time I don't want to lose any quality that the MBP delivers which is quite excellent imho
Does it make much difference for looking at code?
Yes, the main goal is to be easier on the eyes IMO.
It's easier to read on it.
Just replaced the battery on my MacBook Pro 2015 Retina for the first time. It feels so good using such an old piece of hardware.
I love mine; it has a fresh OEM battery as well. Runs the latest OS with OpenCore Legacy. But it's starting to get a bit annoying: usable, but it's starting to feel slowish, and the fan kicks up frequently.
I might still keep it another year or so, which is a testament to how good it is and how relatively little progress has happened in almost 10 years.
If I still had my 2015 I would have applied some liquid-metal TIM by now; I did a paste refresh and that worked very well to get the fan under control.
If it's got a full function row, it will probably work just fine under Linux. My 2014 MBP chugged pretty hard with OpenCore but handles modern Linux distros much better.
Same, the jump to the last few OS versions is not pleasant. Do you get good battery life on Linux with it?
How long is your battery life? I've replaced the battery 2 times in my MBP 2015 (iFixit battery) but battery life is only 2.5 hours and I'm perplexed.
Which MacOS version? I upgraded to a newer one and it crawled to a halt, it's unusable now. UI is insanely laggy. It's sitting in a drawer gathering dust now
I have a 16" M1 Pro with 16 gigs of ram, and it regularly struggles under the "load" of Firebase emulator.
You can tell not because the system temp rises, but because suddenly Spotify audio begins to pop, constantly and irregularly.
It took me a year to figure out that the system audio popping wasn't hardware and indeed wasn't software, except in the sense that memory (or CPU?) pressure seems to be the culprit.
This kind of sounds like someone is abusing perf cores and high priority threading in your stack. iirc, on MacOS audio workgroup threads are supposed to be scheduled with the highest (real time) priority on p cores, which shouldn't have issues under load, unless someone else is trying to compete at the same priority.
There is some discussion online on whether this happens when you have a Rosetta app running in the background somewhere (say a util you got via Homebrew, for example).
Even when I remove all "Intel" type apps in activity monitor, I still experience the issue though.
I have a 14" M1 Max with 32gb of ram for work, and it does that popping noise every once it a while too! I've always wondered what was causing it.
I'm relatively surprised modern Macs have the same buffer-underrun issue I had on Intel laptops with PulseAudio 7+ years back.
This happens whenever I load up one of our PyTorch models on my M1 MBP 16gb too. I also hate the part where if the model (or any other set of programs) uses too much RAM the whole system will sometimes straight up hang and then crash due to kernel watchdog timeout instead of just killing the offender.
There is an API, `proc_setpcontrol`, which absolutely no one uses, that does the thing you want.
It definitely gets unstable in those situations, but you probably don't want your scripts randomly OOM killed either.
> There is an API `proc_setpcontrol` which absolutely no one
Gee, I wonder why.
Sorry what is the thing I want in this case? No stuttering or no crashing?
Killing the process instead of affecting (or crashing) the rest of the system.
I hear popping when Chrome opens a big image or something similar. I always assumed it was a Chrome issue.
I've had something similar happen as a bug when I was using the Python sounddevice library and calling numpy functions inside its stream callback. It took me a long time to figure out that numpy subroutines that drop the GIL would cause the audio stream to stall.
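For anyone who hits the same thing, the usual workaround is to keep the audio callback trivial and push all the numpy work into a normal thread feeding a queue. A rough sketch along those lines (the sample rate, block size, and sine-wave producer are just placeholders):

    import queue
    import threading
    import numpy as np
    import sounddevice as sd

    SR, BLOCK = 48_000, 1_024
    buf = queue.Queue(maxsize=8)

    def producer():
        # All heavy numpy work happens here, off the audio thread.
        phase = 0
        while True:
            t = (np.arange(BLOCK) + phase) / SR
            block = 0.2 * np.sin(2 * np.pi * 440 * t).astype(np.float32)
            phase += BLOCK
            buf.put(block.reshape(-1, 1))      # blocks if the queue is full

    def callback(outdata, frames, time_info, status):
        # Keep this as close to a memcpy as possible.
        try:
            outdata[:] = buf.get_nowait()
        except queue.Empty:
            outdata.fill(0)                    # underrun: play silence instead of stalling

    threading.Thread(target=producer, daemon=True).start()
    with sd.OutputStream(samplerate=SR, blocksize=BLOCK, channels=1,
                         dtype="float32", callback=callback):
        sd.sleep(2_000)                        # play for two seconds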
Whoa! I've been so annoyed by this for years, so interesting that you figured it out. It's the kind of inelegance in design that would have had Steve Jobs yelling at everyone to fix, just ruins immersion in music and had no obvious way to fix.
That sounds like an app issue, it might be doing non-realtime-safe operations on a realtime thread. But generally speaking, if you have an issue, use feedback assistant.
It happens in Safari and Music.app. I find that Airpods exacerbate the issue too.
> … advanced 12MP … camera
wot, m8? Only Apple will call a 12 megapixel camera “advanced”. Same MPs as an old iPhone 6 rear camera.
Aside from that, it’s pretty much the same as the prior generation. Same thickness in form factor. Slightly better SoC. Only worth it if you jump from M1 (or any Intel mbp) to M4.
Would be godlike if Apple could make the chip swappable. Buy a Mac Studio M2 Ultra Max Plus. Then just upgrade SoC on an as needed basis.
Would probably meet their carbon neutral/negative goals much faster. Reduce e-waste. Unfortunately this is an American company and got to turn profit. Profit over environment and consumer interests.
You’re comparing cameras against different product segments.
Laptop cameras are significantly smaller in all dimensions than phone cameras. Most laptop cameras are 1-4MP: most are 720p (1MP), and a few are 1080p (2MP). The previous MacBook was 1080p.
For reference, a 4k image is 8MP.
12MP is absolutely a massive resolution bump, and I'd challenge you to find a competitive alternative in a laptop.
I feel like if they pushed Win32/Gaming on Apple Mx hardware it'd give at least a single reason for people to adopt or upgrade their devices to new models. I know for sure I'd be on board if everything that ran on my steam deck ran on a mac game wise, since that's holding me back from dropping the cash. I still think I'll get a mini though.
Valve is trying to obsolete Windows, so they can prevent Microsoft from interfering with Steam. Apple could team up with them, and help obsolete Windows for a very large percentage of game-hours.
There will always be a long tail of niche Windows games (retro + indie especially). But you can capture the Fortnite (evergreen) / Dragon Age (new AAA) audience.
My only explanations for the lack of gaming support (see historical lack of proper OpenGL support) while still supporting high end graphics use cases (film editing, CAD, visual effects) are:
1) Either Apple wants to maintain the image of the Macbook as a "serious device", and not associate itself with the likes of "WoW players in their mom's basement".
2) Microsoft worked something out with Apple, where Apple would not step significantly on the gaming market (Windows, Xbox). I can't think of another reason why gaming on iOS would be just fine, but abysmal on MacOS. Developers release games on MacOS _despite_ the platform.
Steve Jobs was historically against gaming on apple devices and, I believe, went so far as to try to remove them from the Apple Store. Apple is only recently starting to introduce gaming seriously back into the platform.
It would be incredibly fascinating to consider what if Bungie had never been bought by Microsoft and Halo had ended up a Mac title first. It would've severely capped the influence of the game (and maybe its quality), even after it was eventually ported to PC. Would Halo have even been ported to Xbox? On the flip side, if it had somehow managed to capture considerable success, would it have forced Jobs and Apple to recognize the importance of the gaming market? Either way, the entire history of video games would be altered.
It's funny because they directly advertise performance in WoW in M4 presskit https://imgur.com/CoBGQ0b
The Thinkpad webcam is only 5MP. Many other PCs have much less.
More megapixels on a tiny sensor does not make it more advanced. At a certain point it only makes it worse. That doesn't tell you anything about the quality of the image. There is way more to digital cameras than pixel count.
Especially because pixel count is a meaningless metric by itself. 12MP is the same as a Nikon D3, which if it could replicate the results of I would be happy with!
Megapixels is nothing more than the number of sample points. There's so much more to image quality than the number of samples.
I blame the confusion on PC & Android marketing people who spent years and years pushing the idea that the higher the megapixel count, the better the camera. Non-Apple customers should be really pissed off about the years of misinformation and indoctrination around a false KPI.
The marketing gimmicks pushed generations of devices to optimize for meaningless numbers. At times, even Apple was forced to adopt those. Such a shame.
Pixel count alone doesn't determine whether a camera is advanced or not.
> "Up to 7x faster image processing in Affinity Photo"
Great to see Affinity becoming so popular that it gets acknowledged by Apple.
Affinity has been mentioned many times by Apple in their product videos
I’m a fan of their software and pricing.
Affinity is now part of Canva. https://www.canva.com/newsroom/news/affinity/
They're really burying the lede here - magic trackpad and magic keyboard finally have USB-C :)
That was announced on Monday, with the new iMacs.
That's annoying. I really want to fully remove lightning connectors from my life, but, my existing magic* devices work fine and will probably work fine for another decade or two.
It pains me deeply that they used Autodesk Fusion in one of the app screenshots. It is by far the worst piece of software I use on Mac OS.
Wish the nano-texture display was available when I upgraded last year. The last MacBook I personally bought was in 2012 when the first retina MBP had just released. I opted for the "thick" 15" high-res matte option. Those were the days...
Wait really? I love Fusion 360. I suppose I use it on Windows though. Is it significantly worse on Mac?
Oh its awful on the Mac.
I hoped for at least 192GB of RAM for LLMs, as 48GB DDR5 modules are pretty normal nowadays.
They want you to buy a Mac Pro or Studio for that
These chips are incredible. Even my M1 MBP from 2020 still feels so ridiculously fast for everyday basic use and coding.
Is an upgrade really worth it?
I don't think it will "feel" much faster the way the Intel -> M1 jump did, where overall system latency (especially around swap and memory pressure) got much, much better.
If you do any amount of 100% CPU work that blocks your workflow, like waiting for a compiler or typechecker, I think M1 -> M4 is going to be worth it. A few of my peers at the office went M1->M3 and like the faster compile times.
Like, a 20 minute build on M1 becoming a 10 minute build on M4, or a 2 minute build on M1 becoming a 1 minute build on M4, is nothing to scoff at.
I have a MBP M1 16GB at home, and a MBP M3 128GB at work. They feel the same: very fast. When I benchmark things I can see the difference (or when fiddling with larger LLM models), other than that, the M1 is still great and feels faster and more enjoyable than any Windows machine I interact with.
I guess it’s only worth it for people who would really benefit from the speed bump — those who push their machines to the limit and work under tight schedules.
I myself don’t need so much performance, so I tend to keep my devices for many, many years.
I do a lot of (high-end mirrorless camera, ~45MP, 14 bits/pixel raw files) photo processing. There are many individual steps in Photoshop, Lightroom, or various plug-ins that take ~10 seconds on my M1 Max MBP. It definitely doesn't feel fast. I'm planning to upgrade to one of these.
How viable is Asahi Linux these days? The MacBook hardware looks amazing.
No support for M3 or M4 powered machines currently.
> All Apple Silicon Macs are in scope, as well as future generations as development time permits. We currently have support for most machines of the M1 and M2 generations.[^1][^2]
[^1]: https://asahilinux.org/about/
[^2]: https://asahilinux.org/fedora/#device-support
BTW, there is a recent interview with an Asahi dev focusing on GPUs, worth a listen for those interested in Linux on Apple Silicon. The reverse-engineering effort required to pin down the GPU hardware was one of the main topics.
https://softwareengineeringdaily.com/2024/10/15/linux-apple-...
For many years I treated Windows or macOS as a hypervisor - if you love Linux but want the Mac hardware, instant sleep & wake, etc, putting a full screen VM in Parallels or similar is imo better than running Linux in terms of productivity, although it falls short on “freedom”.
I do the same thing, but there are two big caveats:
1. Nested virtualization doesn't work in most virtualization software, so if your workflow involves running stuff in VMs it is not going to work from within another VM. The exception is apparently now the beta version of UTM with the Apple Virtualization backend, but that's highly experimental.
2. Trackpad scrolling is emulated as discrete mouse wheel clicks, which is really annoying for anyone used to the smooth scrolling on macOS. So what I do is use macOS for most browsing and other non-technical stuff but do all my coding in the Linux VM.
Nested virtualization needs at least an M3
https://developer.apple.com/documentation/virtualization/vzg...
This is the sad situation on my M2 MacBook Pro :(
Has anyone tried it recently, specifically the trackpad? I tried the Fedora variant a few months ago on my M1 MacBook and the trackpad was horrible to use; it felt totally foreign and wrong.
I feel you, but Apple's trackpad prowess is not an easy thing to copy. It's one of those things where I never expect anyone else to be able to replicate the deep integration between the hardware and software.
It's 2024, and I still see most Windows users carrying a mouse to use with their laptop.
Insane cost for the amount of storage and RAM. I mean, year over year for Apple, awesome! Compared to the rest of the brands... so ridiculously expensive. Watching the price climb to 5K as you add in the new normal for hardware specs is absurd.
Nice to see they increased the number of performance cores in the M4 Pro, compared to the M3 Pro. Though I am worried about the impact of this change on battery life on the MBPs.
Another positive development was bumping up baseline amounts of RAM. They kept selling machines with just 8 gigabytes of RAM for way longer than they should have. It might be fine for many workflows, but feels weird on “pro” machines at their price points.
I’m sure Apple has been coerced to up its game because of AI. Yet we can rejoice in seeing their laptop hardware, which already surpassed the competition, become even better.
I'm curious why they decided to go this route, but glad to see it. Perhaps ~4 efficiency cores is simply just enough for the average MBP user's standard compute?
In January, after researching, I bought an Apple-refurbished MBP with an M2 Max over an M3 Pro/Max machine because of the performance/efficiency core ratio. I do a lot of music production in DAWs, and many of them, even Apple's Logic Pro, don't really make use of efficiency cores. I'm curious about what constraints have led to this... but perhaps it also factors into Apple's choice to increase the ratio of performance to efficiency cores.
> Perhaps ~4 efficiency cores is simply just enough for the average MBP user's standard compute?
I believe that’s the case. Most times, the performance cores on my M3 Pro laptop remain idle.
What I don’t understand is why battery life isn’t more like that of the MacBook Airs when not using the full power of the SOC. Maybe that’s the downside of having a better display.
> Most times, the performance cores on my M3 Pro laptop remain idle.
Curious how you're measuring this. Can you see it in Activity Monitor?
> Maybe that’s the downside of having a better display.
Yes I think so. Display is a huge fraction of power consumption in typical light (browsing/word processing/email) desktop workloads.
> Curious how you're measuring this. Can you see it in Activity Monitor?
I use an open source app called Stats [1]. It provides a really good overview of the system on the menu bar, and it comes with many customization options.
[1]: https://github.com/exelban/stats
Cool, thanks for the tip!
> Curious how you're measuring this. Can you see it in Activity Monitor?
Yes, processor history in the activity monitor marks out specific cores as Performance and Efficiency.
Example: https://i.redd.it/f87yv7eoqyh91.jpg
Wow, I didn't even realize you could double-click the CPU graph on the main screen to open that view.
Question to more senior Mac users: how do you usually decide when to upgrade?
I bought my first Macbook pro about a year and a half ago and it's still working great.
Ask 3 people, get 5 answers.
Got the money, are in the consumerism camp: Switch to latest model every year because the camera island changed 5mm.
Got the professional need in games or video and your work isn't covering your device: Switch to new model every couple of generations.
Be me: I want to extend the lifecycle of things I use. Learn how to repair what you own (it's never been as easy), be aware of how you can work in today's world (who needs laptop RAM if I can spin up containers in the cloud) - I expect to not upgrade until a similarly stellar step up in the category of Intel to Apple Silicon comes along.
All past Mx versions being mostly compared to Intel baselines: Boring. M4 1.8 times faster than M1 Pro: Nice, but no QoL change. For the few times I might need it, I can spin up a container in the cloud.
My display is excellent.
14 inch is the perfect screen size.
Battery life is perfect.
Great answer, thank you!
I update largely based on non performance criteria:
- new display tech
- better wireless connectivity
- updated protocols on ports (e.g., support for higher res displays and newer displayport/hdmi versions)
- better keyboard
- battery life
Once a few of those changes accumulate over 4+ generations of improvements that’s usually the time for me to upgrade.
My laptops so far: the first 2008 plastic MacBook, a 2012 MacBook Pro, a 2015 MacBook Pro, and currently an M1 Pro 16. I skipped the 2016-2020 generation, which was a massive step backwards on my upgrade criteria, and updated to the 2015 model in 2016 once I realized Apple had lost their marbles and had no near-term plans to make a usable laptop.
Also getting a maxed out configuration really helps the longevity.
On major remodels or with compelling features. I had an i9 MacBook Pro and then upgraded to an M1 MacBook Pro because it was a major leap forward. However, I will wait until the MacBook Pro is redesigned yet again (maybe thinner and lighter, as I travel a lot and carry-on weight is limited), apparently in 2026 or so with OLED and other features, rumors say.
Pick a daily cost you’re comfortable with. If you’re contracting at say $500/day, how much are you willing to spend on having a responsive machine? $10? $20?
Multiply it out: 220 work days a year * $10/day is $2200 a year towards your laptop.
Upgrade accordingly.
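For what it's worth, a tiny sketch of that math, with the daily budget and expected lifespan as made-up numbers purely for illustration:

    # Back-of-the-envelope laptop budget per the "daily cost" rule above.
    # All figures are illustrative assumptions, not recommendations.
    work_days_per_year = 220
    daily_budget = 10            # dollars you're happy to spend per work day
    expected_lifespan_years = 3  # how long you plan to keep the machine

    yearly_budget = work_days_per_year * daily_budget
    total_budget = yearly_budget * expected_lifespan_years
    print(f"${yearly_budget}/year, or about ${total_budget} per machine over "
          f"{expected_lifespan_years} years")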
Depends on whether it is a personal machine or paid for by your company. 5+ years is what I generally expect from an Apple laptop (been using them since around 2007-2009) if I own it. For an M1-3 that could be a bit longer. If it is paid for by your company, then whenever you have the budget :)
> 5+ years is what I generally expect
Sounds like a good rule of thumb.
Can I also ask what kind of work you do on it? I suspect that some work probably wears out computers faster than other sorts of work.
The 2014 model I bought in early 2015 still works, though the battery is dodgy. I did get the motherboard replaced in 2020 which was pricey, but much cheaper than a new machine.
Is there some reason your current computer isn't working for you? If not, why upgrade? Use it as long as you can do so practically & easily.
On the other extreme, I knew someone who bought a new MBP with maximum RAM specs each year. She'd sell the old one for a few hundred less than she paid, then she always had new hardware with applecare. It was basically like leasing a machine for $400/yr.
My previous MacBook was a Pro model from 2015. I waited 6 years to finally upgrade to an M1 Air because of the awful touchbar models they had in between (though I'm still using the 2015 Pro for personal stuff, in fact right now. It's upgraded to the latest macOS using OpenCore and it still runs great). But I would say upgrade every 3-5 years depending on how heavy a professional user you are.
The touchbar was useful in one important way.
Because it made the esc key useless for touch typists and because, as a vi user, I hit esc approximately a bazillion times per day I mapped caps lock to esc.
Now my fingers don't travel as far to hit esc.
I still use that mapping even on my regular keyboards and my current non-touch-bar macs.
Thanks touchbar macs, rest in peace.
People have different passions, I like computers. If I feel a new Mac is going to be fun for whatever reason, I consider upgrading it. Performance wise they last a long time, so I could keep them way longer than I do, but I enjoy newer and more capable models. You can always find someone to buy the older model. Macs have a great second hand market.
I've been using them for several years - I still have a Mac mini (from 2012) and an iMac Pro (from 2017) running. I also get a company MacBook which I can upgrade every three years.
But there is also another strategy: get a new Mac when they come out and sell it before/after the next model appears. There is a large market for used Macs. A friend of mine has been doing this for quite some time.
When it stops working great. My 2014 Macbook is about due for an upgrade, mostly due to the GPU struggling with a 4K screen.
It's hard to imagine any reason why I would not want to keep upgrading to a new MBP every few years -- my M3 MBP is by far the best laptop I've owned, thanks to the incredible battery life.
Of course I'm rooting for competition, but Apple seems to be establishing a bigger and bigger lead with each iteration.
I don’t see the yearly releases as saying you have to upgrade. Rather, having a consistent cadence makes it easier for the supply chain, and the short iteration time means there’s less pressure to rush something in half-baked or delay a release.
My M1 laptop from early 2022 is too good for me to care about upgrading right now, I loaded it up with 64GB ram and it's still blazing. What benefit would I really notice? My heavy apps loading a couple of seconds faster?
What’s amazing is that in the past I’ve felt the need to upgrade within a few years.
New video format or more demanding music software is released that slows the machine down, or battery life craters.
Well, I haven’t had even a tinge of feeling that I need to upgrade after getting my M1 Pro MBP. I can’t remember it ever skipping a beat running a serious Ableton project, or editing in Resolve.
Can stuff be faster? Technically of course. But this is the first machine that even after several years I’ve not caught myself once wishing that it was faster or had more RAM. Not once.
Perhaps it’s my age, or perhaps it’s just that the architecture of these new Mac chips is so damn good.
Laptops in general are just better than they used to be, with modern CPUs and NVMe disks. I feel exactly the same seeing new mobile AMD chips too, I'm pretty sure I'll be happy with my Ryzen 7040-based laptop for at least a few years.
Apple's M1 came at a really interesting point. Intel was still dominating the laptop game for Windows laptops, but generational improvements felt pretty lame. A whole lot of money for mediocre performance gains, high heat output and not very impressive battery. The laptop ecosystem changed rapidly as not only the Apple M1 arrived, but also AMD started to gain real prominence in the laptop market after hitting pretty big in the desktop and data center CPU market. (Addendum: and FWIW, Intel has also gotten a fair bit better at mobile too in the meantime. Their recent mobile chipsets have shown good efficiency improvements.)
If Qualcomm's Windows on ARM efforts live past the ARM lawsuit, I imagine a couple generations from now they could also have a fairly compelling product. In my eyes, there has never been a better time to buy a laptop.
(Obligatory: I do have an M2 laptop in my possession from work. The hardware is very nice, it beats the battery life on my AMD laptop even if the AMD laptop chews through some compute a bit faster. That said, I love the AMD laptop because it runs Linux really well. I've tried Asahi on an M1 Mac Mini, it is very cool but not something I'd consider daily driving soon.)
> Laptops in general are just better than they used to be, with modern CPUs and NVMe disks. I feel exactly the same seeing new mobile AMD chips too, I'm pretty sure I'll be happy with my Ryzen 7040-based laptop for at least a few years.
You say that, but I get extremely frustrated at how slow my Surface Pro 10 is (with an Ultra 7 165U).
It could be Windows of course, but this is a much more modern machine than my MacBook Air (M1) and yet it feels almost 10 years old at times in comparison, despite being 3-4 years newer.
It's true that Linux may be a bit better in some cases, if you have a system that has good Linux support, but I think in most cases it should never make a very substantial difference. On some of the newer Intel laptops, there are still missing power management features anyways, so it's hard to compare.
That said, Intel still has yet to catch up to AMD on efficiency. They've improved generationally, but if you look at power-efficiency benchmarks of Intel CPUs vs AMD, you can see AMD comfortably owns the entire top of the chart. Also, as a many-time Microsoft Surface owner, I can confirm that these devices are rarely good showcases for the chipsets inside of them: they tend to be constrained by both power and thermal limits. There are a lot of good laptops on the market, and I wouldn't compare a MacBook, even a MacBook Air (a laptop), with a Surface Pro (a 2-in-1 device). Heck, even my Intel Surface Laptop 4, a device I kinda like, isn't the ideal showcase for its already mediocre 11th-gen Intel processor...
The Mac laptop market is pretty easy: you buy the laptops they make, and you get what you get. On one hand, that means no need to worry about looking at reviews or comparisons, except to pick a model. They all perform reasonably well, the touchpad will always be good, the keyboard is alright. On the other hand, you really do get what you get: no touchscreens, no repairability, no booting directly into Windows, etc.
I boot Windows on my Mac M1 just fine. Just yesterday I played Age of Empires 3.
I changed the wording to be "booting directly" to clarify that I'm not including VMs. If I have to explain why that matters I guess I can, but I am pretty sure you know.
I am genuinely interested, why does it matter? The performance is more than good enough even to run a Visual Studio (not Code).
If the roles were reversed would you still need an explanation? e.g. If I could run macOS inside of a VM on Windows and run things like Final Cut and XCode with sufficient performance, would you think there's no benefit to being able to boot macOS natively?
Booting natively means you need real drivers, which don't exist for Windows on Mac as well as for macOS on PC. It'd be useless. Just use the VM, it's good.
And it's not the same - running Windows natively on Mac would seriously degrade the Mac, while running macOS on a PC has no reason to make it worse than with Windows. Why not buy a PC laptop at that point? The close hardware/OS integration is the whole point of the product. Putting Windows into a VM lets you use best of both.
The question was a hypothetical. What if the macOS VM was perfect? If it was perfect, would it then not matter if you couldn't just boot into macOS?
I'm pretty sure you would never use a Windows PC just to boot into a macOS VM, even if it was flawless. And there are people who would never boot a Mac just to boot into a Windows VM, even if it was flawless. And no, it's not flawless. Being able to run a relatively old strategy game is not a great demonstration of the ability to run any random Windows game. I have a Parallels and a VMware Fusion license (well... had, anyway), and I'm a long-time (20 years) Linux user; I promise that I am not talking out my ass when it comes to knowing all about the compromises of interoperability software.
To be clear, I am not trying to tell you that the interoperability software is useless, or that it doesn't work just fine for you. I'm trying to say that in a world where the marketshare of Windows is around 70%, a lot of people depend on software and workflows that only work on Windows. A lot of people buy PCs specifically to play video games, possibly even as a job (creating videos/streaming/competing in esports teams/developing video games and related software) and they don't want additional input latency, lower performance, and worse compatibility.
Even the imperfections of virtual machines aside, some people just don't like macOS. I don't like macOS or Windows at all. I think they are both irritating to use in a way that I find hard to stomach. That doesn't mean that I don't acknowledge the existence of many people who very much rely on their macOS and Windows systems, the software ecosystems of their respective systems, and the workflows that they execute on those systems.
So basically, aside from the imperfections of a virtual machine, the ability to choose to run Windows as your native operating system is really important for the obvious case where it's the operating system you would prefer to run.
I still don't understand why would you buy a Mac if you want to run Windows.
Exactly. You wouldn't.
I’ll agree the AMD laptops from the past couple of years are really impressive. They are fast enough that I’ve done some bioinformatics work on one.
Battery life is decent.
At this point I’m not switching from laptop Linux. The machines can even game (thanks proton/steam)
the office Ryzen thinkpads we have are ok...but they're definitely no M1 MacBook Air or Pro...
If we're mostly concerned about CPU grunt, it's really hard to question the Ryzen 7040, which like the M1, is also not the newest generation chip, though it is newer than the M1 by a couple of years. Still, comparing an M1 MacBook Pro with a Framework 16 on Geekbench:
https://browser.geekbench.com/macs/macbook-pro-14-inch-2021-...
https://browser.geekbench.com/v6/cpu/4260192
Both of these CPUs perform well enough that most users will not need to be concerned at all about the compute power. Newer CPUs are doing better but it'd be hard to notice day-to-day.
As for other laptop features... That'll obviously be vendor-dependent. The biggest advantage of the PC market is all of the choices you get to make, and the biggest disadvantage of the PC market is all of the choices you have to make. (Edit: Though if anyone wants a comparison point, just for sake of argument, I think generally the strongest options have been from ASUS. Right now, the Zephyrus G16 has been reviewing pretty good, with people mostly just complaining that it is too expensive. Certainly can't argue with that. Personally, I run Framework, but I don't really run the latest-and-greatest mobile chipsets most of the time, and I don't think Framework is ideal for people who want that.)
what about heat and noise?
those are another two reasons why I can't ignore Apple Silicon
Ultimately it'll be subjective, but the fans don't really spin up on my Framework 16 unless I push things. Running a game or compiling on all cores for a while will do the trick. The exact battery life, thermals and noise will be heavily dependent on the laptop; the TDP of modern laptop CPUs is probably mostly pretty comparable so a lot of it will come down to thermal design. Same for battery life and noise, depends a lot on things other than the CPU.
>Laptops in general are just better than they used to be, with modern CPUs and NVMe disks.
I've had my xps 13 since 2016. Really the only fault I have against it nowadays is that 8gb of ram is not sufficient to run intellij anymore (hell, sometimes it even bogs down my 16gb mbp).
Now, I've also built an absolute beast of a workstation with a 7800x3d, 64gb ram, 24 gb vram and a fast ssd. Is it faster than both? Yeah. Is my old xps slow enough to annoy me? Not really. Youtube has been sluggish to load / render here lately but I think that's much more that google is making changes to make firefox / ublock a worse experience than any fault of the laptop.
Regarding Youtube, Google is also waging a silent war against Invidious. It's to the point that even running helper scripts to trick Youtube isn't enough (yet). I can't imagine battling active and clever adversaries speeds up Youtube page loads as it runs through its myriad checks that block Invidious.
I only do coding & browsing so maybe I'm a weak example but I find this even with my pretty old Intel laptops these days.
My Skylake one (I think that would be 6 years old now?) is doing absolutely fine. My Broadwell one is starting to feel a little aged but perfectly usable, I wouldn't even _consider_ upgrading it if I was in the bottom 95% of global income.
Compiling is very slow on these, but I think I'd avoid compilation on my laptop even if I had a cutting edge CPU?
Depends. I used to offload almost all compilation tasks, but now I only really do this if it's especially large. If I want to update my NixOS configuration I don't bother offloading it anymore. (NixOS isn't exactly Gentoo or anything, but I do have some overrides that necessitate a decent amount of compilation, mainly dogfooding my merge requests before they get merged/released.)
YMMV.
> If Qualcomm's Windows on ARM efforts live past the ARM lawsuit
FWIW, Qualcomm cancelled orders of its Windows devkit and issued refunds before the lawsuit. That is probably not a good sign
I am on Intel TGL currently and can't wait for Strix Halo next year. That is truly something else; iGPU-wise, it's nothing we have seen in notebooks before.
I've had a couple of Tiger Lake laptops, a Thinkpad and I believe my Surface Laptop 4. Based on my experience with current AMD mobile chipsets, I can only imagine the Strix Halo will be quite a massive uplift for you even if the generational improvements aren't impressive.
I've owned an M1 MBP base model since 2021 and I just got an M3 Max for work. I was curious to see if it "felt" different and was contemplating an upgrade to M4. You know what? It doesn't really feel different. I think my browser opens about 1 second faster from a cold start. But other than that, no perceptible difference day to day.
> It doesn't really feel different.
My work machine was upgraded from an M1 with 16GB of RAM to an M3 Max with 36GB and the difference in Xcode compile times is beyond belief: I went from something like 1-2 minutes to 15-20 seconds.
Obviously, if opening a browser is the most taxing thing your machine is doing, the difference will be minimal. But for video or music editing, application compiling and other intensive tasks, the upgrade is PHENOMENAL.
FWIW I think that's more the core count than anything. I have a M1 Max as a personal machine and an M3 Max at work and while the M3 Max is definitely faster, it isn't world-beating.
My current work machine is M1 Max 64Gb and it's the fastest computer I've ever used. Watching rust code compile makes me laugh out loud it's so quick. Really curious what the newer ones are like, but tbh I don't feel any pressure to upgrade (could just be blissfully ignorant).
I very much enjoy being able to start compilation and just seeing results fly by.
I think most of that difference is going to be the huge increase in performance core count between the base chip and the Max (from 4 to 12). The RAM certainly doesn't hurt though!
This is how I feel about the last few iPhones as well
I upgraded from a 13 pro to a 15 pro expecting zippier performance and it feels almost identical if not weirdly a bit slower in rendering and typing
I wonder what it will take to make Mac/iOS feel faster
> I upgraded from a 13 pro to a 15 pro expecting zippier performance and it feels almost identical if not weirdly a bit slower in rendering and typing
I went from an iPhone 13 mini to an iPhone 16 and it's a significant speed boost.
I went from a 12 to a 15 Pro Max, and the difference is significant. I can listen to Spotify while shooting with the camera. On my old iPhone 12, this was not possible.
I think that says more about Spotify than your phone.
Test Spotify against YouTube Music (and others) - I personally see no reason for Spotify when I have YouTube Premium, which performs with less overhead.
Maybe they have friends and family on Spotify
I’m sure you’re right but that’s pretty unreal.
> I wonder what it will take to make Mac/iOS feel faster
I know, disabling shadows and customisable animation times ;) On a jailbroken phone I once could disable all animation delays, it felt like a new machine (must add that the animations are very important and generally great ux design, but most are just a tad too slow)
16 pro has a specialized camera button which is a game changer for street / travel photography. I upgraded from 13 pro and use that. But no other noticeable improvements. Maybe Apple intelligence summarizing wordy emails.
I think the only upgrade worth it now is from a non-Pro to a Pro, since a 120Hz screen is noticeably better than a 60Hz screen (and it's a borderline scam that a 1000 Euro phone does not have 120Hz).
The new camera button is kinda nice though.
> The new camera button is kinda nice though.
I was initially indifferent about the camera button, but now that I'm used to it it's actually very useful.
I upgraded my iPhone 13 pro to the 16 pro and it was overall really nice - but it was the better use of hardware, the zoom camera, etc.
The CPU? Ah, never really felt a difference.
XR to 13, as I don't want the latest and didn't want to lose my jailbreak.
Infuriated by the 13.
The 3.5mm audio thunder bolt adapters disconnect more often than usual. All I need to do is tap the adapter and it disconnects.
And that Apple has now stopped selling them is even more infuriating, it's not a faulty adapter.
I realize this isn't your particular use case. But with newer iPhones, you can use USB-C directly for audio. I've been using the Audio Technica ATH-M50xSTS for a while now. The audio quality is exceptional. For Slack/Teams/Zoom calls, the sidetone feature plays your voice back inside the headphones, with the level being adjustable via a small toggle switch on the left side. That makes all the difference, similar to transparency/adaptive modes on the AirPod Pro 2s (or older cellphones and landlines).
I use a small Anker USB-A to USB-C adapter [1]. They're rock solid.
As great as the AirPod Pro 2s are, a wired connection is superior in terms of reliability and latency. Although greatly improved over the years, I still have occasional issues connecting or switching between devices.
Out of curiosity, what's the advantage of a jailbroken iPhone nowadays? I'd typically unlock Android phones in the past, but I don't see a need on iOS today.
Interestingly, the last time I used Android, I had to sideload Adguard (an adblocker). On the App Store, it's just another app alongside competing adblockers. No such apps existed in the Play Store to provide system-level blocking, proxying, etc. Yes, browser extensions can be used, but that doesn't cover Google's incessant quest to bypass adblockers (looking at you Google News).
[0] https://www.audio-technica.com/en-us/ath-m50xsts [1] https://www.amazon.com/Adapter-Anker-High-Speed-Transfer-Not...
Adguard is on the Play Store. Netguard as well.
> Out of curiosity, what's the advantage of a jailbroken iPhone nowadays? I'd typically unlock Android phones in the past, but I don't see a need on iOS today.
I have custom scripts, Ad blocking without VPNs, Application firewalls.
I enjoy having most-full control of my device.
> The 3.5mm thunder bolt adapters
The what? Is this the adapter for 3.5mm headphones? If so, you don't have to get Apple-made dongles. Third parties make them also.
Or just buy the actual Apple adapter from any number of other vendors. Best Buy still has plenty in stock, for instance.
I'd guess the GPs actual problem is lint in the Lightning port though. Pretty common, relatively easy to clean out too, especially compared to USB-C.
I'm in the EU. Third party ones cost the same as authentic Apple ones. If not more.
Regardless of either, they both have the same fault.
The connector between the phone and the adapter is poor. It could just be a fault with my phone but I have no way of proving this.
Third party ones are almost certainly not as good as the actual Apple ones. The Apple one has remarkably good quality for its price.
I suspect this sounds like a problem with your specific phone. Never had a problem with any lightning accessories myself.
Yes, which have the same fault as Apple authentic adapters which cost the same amount if not more.
It’s probably because of the jailbreak.
How would that wear out his port?
I've found compile times on large C++ code bases to be the only thing I really notice improving. I recently upgraded my work machine from a 2017 i7 to a shiny new Ryzen 9 9950x and my clean compile times went from 3.5 minutes to 15 seconds haha. When I compile with an M2 Max, it's about 30s, so decent for a laptop, but also it was 2x the price of my new desktop workstation.
The biggest difference I’ve seen is iPad Sidecar mode works far more reliably with the M3 Max than the M1 Max. There have been incremental improvements in speed and nits too, but having Sidecar not randomly crash once a day once on M3 was very nice.
Can confirm. I have an M2 Air from work and an M1 Pro for personal, and tbh, both absolutely fly. I haven't had a serious reason to upgrade. The only reason I do kind of want to swap out my M1 Pro is because the 13" screen is a wee bit small, but I also use the thing docked more often than not so it's very hard to justify spending the money.
On the other side, as someone doing a lot of work in the GenAI space, I'm simultaneously amazed that I can run Flux [dev] on my laptop and use local LLMs for a variety of tasks, while also wishing that I had more RAM and more processing power, despite having a top of the line M3 max MBP.
But it is wild that two years ago running any sort of useful genAI stuff on a MBP was more-or-less a theoretical curiosity, and already today you can easily run models that would have exceeded SotA 2 years ago.
Somewhat ironically, I got into the "AI" space a complete skeptic, but thinking it would be fun to play with nonetheless. After 2 years of daily work with these models I'm increasingly convinced they are going to be disruptive. No AGI, but it will certainly reduce a lot of labor and enable things that weren't really feasible before. Best of all, it's clear a lot of this work will be doable from a laptop!
I would love to hear more about what exactly you think will be disruptive. I don’t know the LLM world very well.
> I haven’t had even a tinge of feeling that I need to upgrade after getting my M1 Pro MBP.
I upgraded my M1 MBP to a MacBook Air M3 15" and it was a major upgrade. It is the same weight but 40% faster and so much nicer to work on while on the sofa or traveling. The screen is also brighter.
I think very few people actually do need the heavy MBPs, especially not the web/full-stack devs who populate Hacker News.
EDIT: The screens are not different in terms of brightness.
> I think very few people actually do need the heavy MBPs, especially not the web/full-stack devs who populate Hacker News.
I can fairly easily get my M1 Air to have thermal issues while on extended video calls with some Docker containers running, and have been on calls with others having the same issue. Kind of sucks if it's, say, an important demo. I mostly use it as a thin client to my desktop when I'm away from home, so it's not really an issue, but if I were using it as a primary device I'd want a machine with a fan.
That makes sense from your workflow needs.
I try to avoid docker in general during local dev, and luckily it has worked out for me even with microservice architectures. It dramatically reduces CPU and RAM needs and also shortens cycle time.
Looked at it but ruled out the Air due to lack of ports and limited RAM upgrades.
Pretty sure Air displays don't support HDR, are they really brighter?
I am not sure. I notice a difference. Maybe it is just screen age related?
They supposedly have the same base brightness (500 nits), with Pro allowing up to 1000 in HDR mode (and up to 1600 peak).
Air doesn't support 120Hz refresh either.
There's an app (Vivid) [0] that lets you unlock max brightness on Pros even without HDR content (no affiliation).
HDR support is most noticeable when viewing iPhone photos and videos, since iPhones shoot in HDR by default.
[0] https://www.getvivid.app
I just looked at it again side by side and I think they are actually the same. Not sure why I earlier thought they were different.
On a tangent, if I have a M3 pro laptop how do I test HDR? Download a test movie from where, play it with what?
I may or may not have seen HDR content accidentally, but I’m not sure.
You can search for videos on YouTube and filter by HDR. Apple TV shows are typically in HDR (Dolby Vision). Here are a couple of examples:
[0] Hawaii LG Demo: https://www.youtube.com/watch?v=WBJzp-y4BHA [1] Nature Demo: https://www.youtube.com/watch?v=NFFGbZIqi3U
YouTube shows a small red "HDR" label on the video settings icon for actual HDR content. For this label to appear, the display must support HDR. With your M3 Pro, the HDR label should appear in Chrome and Safari.
You can also right-click on the video to enable "Stats for nerds" for more details. Next to color, look for "smpte2084 (PQ) / bt2020". That's usually the highest-quality HDR video [2,3].
You can ignore claims such as "Dolby Vision/Audio". YouTube doesn't support those formats, even if the source material used it. When searching for videos, apply the HDR filter afterward to avoid videos falsely described as "HDR".
Keep in mind that macOS uses a different approach when rendering HDR content. Any UI elements outside the HDR content window will be slightly dimmed, while the HDR region will use the full dynamic range.
I consider Vivid [4] an essential app for MacBook Pro XDR displays.
Once installed, you can keep pressing the "increase brightness" key to go beyond the default SDR range, effectively doubling the brightness of your display without sacrificing color accuracy. It's especially useful outdoors, even indoors, depending on the lighting conditions. And fantastic for demoing content to colleagues or in public settings (like conference booths).
[2] https://www.benq.com/en-us/knowledge-center/knowledge/bt2020... [3] https://encyclopedia.pub/entry/32320 (see section 4) [4] https://www.getvivid.app/
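And if you have a local clip and want to verify it actually carries the PQ/BT.2020 metadata described above, here's a rough sketch. It assumes ffprobe from FFmpeg is installed; "clip.mp4" is a placeholder filename, and HLG or Dolby Vision profiles would need extra checks:

    # Rough check for HDR10/PQ metadata (smpte2084 transfer + bt2020 primaries)
    # in a local video file. Assumes ffprobe (FFmpeg) is on PATH; "clip.mp4"
    # is a placeholder. HLG/Dolby Vision would need additional checks.
    import json
    import subprocess

    result = subprocess.run(
        ["ffprobe", "-v", "error", "-select_streams", "v:0",
         "-show_entries", "stream=color_transfer,color_primaries",
         "-of", "json", "clip.mp4"],
        capture_output=True, text=True, check=True,
    )
    stream = json.loads(result.stdout)["streams"][0]
    is_pq_hdr = (stream.get("color_transfer") == "smpte2084"
                 and "bt2020" in stream.get("color_primaries", ""))
    print(stream, "-> HDR (PQ/BT.2020)" if is_pq_hdr else "-> looks like SDR")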
> With your M3 Pro, the HDR label should appear in Chrome and Safari.
Ahh. Not Firefox, of course.
Thanks, I just ran a random nature video in Safari. It was pretty. The commercials before it were extremely annoying though. I don't think it's even legal here to have so many ads per minute of content as Google inserts on youtube.
You can just search for HDR videos in Youtube.
the Air doesn't have ProMotion right? that feature is non-negotiable on any display for me nowadays
For me a faster refresh rate is noticeable on a phone or iPad, where you scroll all the time. On a laptop you don't have that much smooth scrolling. For me it's a non-issue on a laptop; not once have I wished it had a faster refresh. Whereas I always notice when switching between the Pro and non-Pro iPad.
I have ProMotion on my MBP and iPhone but… it’s ok? Honestly, I use an older computer or iPhone temporarily and don’t notice a difference.
I’m looking forward to the day I notice the difference so I can appreciate what I have.
I find 60Hz on the non-Pro iPhone obnoxious since switching to 120Hz screens. On the other hand, I do not care much about 60Hz when it comes to computer screens. I think touch interfaces make low refresh rates much more noticeable.
I wonder. Do you do a lot of doom scrolling?
I can’t understand the people who notice the 120 hz adaptive refresh whatever and one guess is their use is a lot twitchier than mine.
No doomscrolling at all. Even switching between home screens looks like it's dropping frames left and right (it's not, of course, but that's what it looks like coming from 120Hz). A Galaxy A54 that we still have in the house, which was just over 300 Euro, feels much smoother than my old iPhone 15 that cost close to 1000 Euro, because it has a 120Hz screen.
Even 90Hz (like on some Pixels) is substantially better than the iPhone's 60Hz.
The Galaxy must be new. In my experience Android phones get extremely laggy [1] as they get old and the 120 Hz refresh won't save you :)
I just noticed that I don't really try to follow the screen when I scroll down HN, for example. Yes it's blurry but I seem not to care.
[1] Source: my Galaxy something phone that I keep on my desk for when I do Android development. It has no personal stuff on it, it's only used to test apps that I work on, and even that isn't my main job (nothing since early spring this year for example). It was very smooth when I bought it, now it takes 5+ seconds to start any application on it and they stutter.
A lot of my work can be easily done with a Celeron - it's editing source, compiling very little, running tests on Python code, running small Docker containers and so on. Could it be faster? Of course! Do I need it to be faster? Not really.
I am due to update my Mac mini because my current one can't run Sonoma, but, apart from that, it's a lovely little box with more than enough power for me.
I still use Ivy Bridge and Haswell workstations (with Linux, SSD and discrete GPU) as my daily drivers and for the things I do they still feel fast. Honestly a new Celeron probably beats them performance wise.
The modern AMD or Intel desktops I've tried obviously are much faster when performing large builds and such but for general computing, web browsing, and so forth I literally don't feel much of a difference. Now for mobile devices it's a different story due to the increased efficiency and hence battery life.
How's the performance of Gmail on the Celeron? That's always my sticking point for older computers. The fancy web applications really drag.
Not great. Works well with Thunderbird or Evolution though.
And yes. Web apps are not really great on low-spec machines.
My 2019 i9 flagship MBP is just so, so terrible, and my wife's M1 MacBook Air is so, so great. I can't get over how much better her computer is than mine.
It's so nice being able to advise a family member who is looking to upgrade their intel Mac to something new, and just tell them to buy whatever is out, not worry about release dates, not worry about things being out of date, and so on.
The latest of whatever you have will be so much better than the intel one, and the next advances will be so marginal, that it's not even worth looking at a buyer's guide.
M3 Air with 16gb (base config as of today) is potentially a decade’s worth of computer. Amazing value.
Base 16gb is absolutely wild. My base m2 air with 8gb handles almost anything I’d ever want it to with zero slowdown.
A 16gb model for about a thousand bucks?? I can’t believe how far macbooks have come in the last few years
I think this is confirmed by the fact that software vendors are still not taking full advantage of ARM chips' maximum performance.
Where this might shift is as we start using more applications that are powered by locally running LLMs.
I would normally never upgrade so soon after getting an M1 but running local LLMs is extremely cool and useful to the point where I'd want the extra RAM and CPU to run larger models more quickly.
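As a concrete example of what "local" looks like in practice, here's a minimal sketch that queries a model through Ollama's HTTP API. It assumes Ollama is running on its default port and that the llama3.2 model has already been pulled; the model name and prompt are just placeholders:

    # Minimal sketch: ask a locally running model a question via Ollama's
    # HTTP API. Assumes the Ollama server is up on its default port and that
    # `ollama pull llama3.2` has been run; model and prompt are placeholders.
    import json
    import urllib.request

    payload = json.dumps({
        "model": "llama3.2",
        "prompt": "Why does unified memory help local LLM inference?",
        "stream": False,
    }).encode()

    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])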
I'm bumping from a still-excellent M1 MAX / 64GB to M4 MAX / 128GB, mostly for local GenAI. It gives me some other uplift and also enables me to sell this system while it's still attractive. I'm able to exhaust local 7B models fairly easily on it.
I have a 64gb M1 Max and already do that
but yes, I was looking at and anticipating the max RAM on the M4 as well as the max memory speed
128gb and 546GB/s memory bandwidth
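A rough rule of thumb for why the bandwidth number matters: single-stream decoding is mostly memory-bound, so an upper bound on tokens per second is roughly bandwidth divided by the size of the weights being streamed. The numbers below are illustrative assumptions, not measurements:

    # Rough upper bound on tokens/sec for memory-bandwidth-bound decoding:
    # each generated token streams (roughly) all model weights from memory.
    # The model size and bytes/param below are illustrative assumptions.
    def rough_tokens_per_sec(params_billion, bytes_per_param, bandwidth_gb_s):
        weights_gb = params_billion * bytes_per_param
        return bandwidth_gb_s / weights_gb

    # e.g. a 70B model at ~4.5 bits/param (~0.56 bytes) on 546 GB/s:
    print(f"{rough_tokens_per_sec(70, 0.56, 546):.0f} tokens/s, best case")

Real throughput comes in lower once you account for compute, KV cache traffic and batching, but it explains why the bandwidth spec is worth watching.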
I like it, I don't know yet on an upgrade. But I like it. Was hoping for more RAM actually, but this is nice.
I don't think this has anything to do with the hardware. I think we have entered an age where users in general are not upgrading. As such, software can't demand more and more performance. The M1 came out at a time when mostly all hardware innovation had stagnated. Default RAM in a laptop has been 16G for over 5 years. 2 years ago, you couldn't even get more than 16 in most laptops. As such, software hardware requirements haven't changed. So any modern CPU is going to feel overpowered. This isn't unique to M1's.
That’s because today’s hw is perfectly capable of running tomorrow’s software at reasonable speed. There aren’t huge drivers of new functionality that needs new software. Displays are fantastic, cellular speeds are amazing and can stream video, battery life is excellent, UIs are smooth with no jankiness, and cameras are good enough.
Why would people feel the need to upgrade?
And this applies already to phones. Laptops have been slowing for even longer.
Until everything starts running local inference. A real Siri that can operate your phone for you, and actually do things like process cross-app conditions ("Hey Siri, if I get an email from my wife today, notify me, then block out my calendar for the afternoon.") would use those increased compute and memory resources easily.
Apple has been shipping "neural" processors for a while now, and when software with local inference starts landing, Apple hardware will be a natural place for it. They'll get to say "Your data, on your device, working for you; no subscription or API key needed."
That's a very big maybe. The LLM experience locally is currently very very different from the hosted models most people play with. The future is still very uncertain.
I standardized on 16gb for my laptops over 10 years ago. I keep a late 2013 MBP with 16 for testing projects on, separate from my main Linux box.
Getting an extra five years of longevity (after RAM became fixed) for an extra 10% was a no-brainer imho.
I upgraded from the last 16" MBP Intel sold to the first 16" MBP M1 available.
It is absolutely, 100%, no doubt in my mind: the hardware.
Yep, the same, M1 Pro from 2021. It's remarkable how snappy it still feels years later, and I still virtually never hear the fan. The M-series of chips is a really remarkable achievement in hardware.
> Perhaps it’s my age
I always catch myself in this same train of thought until it finally re-occurs to me that "no, the variable here is just that you're old." Part of it is that I have more money now, so I buy better products that last longer. Part of it is that I have less uninterrupted time for diving deeply into new interests which leads to always having new products on the wishlist.
In the world of personal computers, I've seen very few must-have advances in adulthood. The only two unquestionable big jumps I can think of off hand are Apple's 5K screens (how has that been ten years?!) and Apple Silicon. Other huge improvements were more gradual, like Wi-Fi, affordable SSDs, and energy efficiency. (Of course it's notable that I'm not into PC gaming, where I know there has been incredible advances in performance and display tech.)
I agree with you about not needing to upgrade, but it still stands that IMHO Apple is better off upgrading, or even being pushed to upgrade by competition. (Also it's really good that Macs now have 16GB of RAM by default.) As I have my 14.2-inch M1 Max, I believe the only reason I would want to upgrade is that the new one can be configured with 128GB of RAM, which allows you to load newer AI models on device.
The MacBook Pro does seem to have some quality-of-life improvements, such as Thunderbolt 5, a 14-megapixel Center Stage camera (it follows you), three USB-C ports on every model, and battery life claims of 22-24 hours. Regardless, if you want a MacBook Pro and you don't have one, there is now an argument for not just buying the previous model.
Work just upgraded my M1 Pro to M3 Pro and I don't notice any difference except for now having two laptops.
I've had Macs before, from work, but there is something about the M1 Pro that feels like a major step up.
Only recently I noticed some slowness. I think Google Photos changed something and they show photos in HDR and it causes unsmooth scrolling. I wonder if it's something fixable on Google's side though.
I bought my M1 Pro MBP in 2021. Gave it 16G of RAM and a 1TB HD. I plan to keep it until circa 2031.
Same. I used to upgrade every 1.5 years or so. But with every Apple Silicon generation so far I have felt that there are really no good reasons to upgrade. I have a MacBook M3 Pro for work, but there are no convincing differences compared to the M1 Pro.
In fact, I bought a highly discounted Mac Studio with M1 Ultra because the M1 is still so good and it gives me 10Gbit ethernet, 20 cores and a lot of memory.
The only thing I am thinking about is going back to the MacBook Air again since I like the lighter form factor. But the display, 24 GiB max RAM and only 2 Thunderbolt ports would be a significant downgrade.
Guess that’s why most of their comparisons are with the older Intel Macs.
And M1 from 4 years ago instead of M3 from last year; while a 2x speed improvement in the benchmarks they listed is good, it also shows that the M series CPUs see incremental improvements, not exponential or revolutionary. I get the feeling - but a CPU expert can correct me / say more - that their base design is mostly unchanged since M1, but the manufacturing process has improved (leading to less power consumption/heat), the amount of cores has increased, and they added specialized hardware for AI-related workloads.
That said, they are in a very comfortable position right now, with neither Intel, AMD, or another competitor able to produce anything close to the bang-for-watt that Apple is managing. Little pressure from behind them to push for more performance.
Their sales pitch when they released the M1 was that the architecture would scale linearly and so far this appears to be true.
It seems like they bump the base frequency of the CPU cores with every revision to get some easy performance gains (the M1 was 3.2 GHz and the M3 is now 4.1 GHz for the performance cores), but it looks like this comes at the cost of it not being able to maintain the performance; some M3 reviews noted that the system starts throttling much earlier than an M1.
Apple updates their microarchitecture with each bump.
I have a 2009 and a 2018 Windows laptop.
The only reason the 2009 one now gets little use is that its motherboard has developed some electronic issues; otherwise it would serve me perfectly well.
Same feeling. The jump from all the previous laptops I owned to an M1 was an incredible jump. The thing is fast, has amazing battery life and stays cold. Never felt the need to upgrade.
I have an MBP M1 Max and the only time I really feel like I need more oomph is when I'm doing live previews and/or rendering in After Effects. I find myself having to clear the cache constantly.
Other than that it cruises across all other applications. Hard to justify an upgrade purely for that one issue when everything else is so solid. But it does make the eyes wander...
The next upgrade wave will probably come from AI features needing more local memory and compute. The software is just not there yet for everyday tasks, but it's just a question of time, I guess. Of course there will be pressure to do that in the cloud as usual, but local compute will always remain a market.
and probably it's good that at least one of the big players has a business model that supports driving that forward
I think regretting Mac upgrades is a real thing, at least for me. I got a 32G Mac mini in January to run local LLMs. While it does so beautifully, there are now smaller LLMs that run fine on my very old 8G M1 MacBook Pro, and these newer smaller models do almost all of what I want for NLP tasks, data transformation, RAG, etc. I feel like I wasted my money.
Small models retain much less of the knowledge they were trained on, especially when quantized.
One good use case for 32gb Mac is being able to run 8b models at full precision, something that is not possible with 8-16gb macs
Or better run quantized 14B or even 32B models...
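Roughly what fits where, ignoring KV cache and runtime overhead (which add several more GB in practice); the bytes-per-parameter figures are approximations:

    # Very rough weight-memory estimates; ignores KV cache and runtime overhead.
    # bytes_per_param: ~2.0 for fp16 "full precision", ~0.5 for 4-bit quantization.
    def weights_gb(params_billion, bytes_per_param):
        return params_billion * bytes_per_param

    for label, params, bpp in [("8B fp16", 8, 2.0), ("8B 4-bit", 8, 0.5),
                               ("14B 4-bit", 14, 0.5), ("32B 4-bit", 32, 0.5)]:
        print(f"{label}: ~{weights_gb(params, bpp):.0f} GB of weights")

Which is roughly why an 8B model at fp16 (~16 GB) is a squeeze on an 8-16GB machine but comfortable with 32GB, and why quantized 14B/32B models also become practical there.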
You can sell it, get most of your money back.
Which ones in particular? I have an M2 air with 8GB, and doing some RAG development locally would be fantastic. I tried running Ollama with llama3.2 and it predictably bombed.
Out of curiosity and also because I'm wondering which specification to potentially buy in the future, how much RAM does your MBP have?
I feel the same way about my M1 Macbook Air ... it's such a silly small and powerful machine. I've got money to upgrade, I just have no need. It's more than enough for even demanding Logic sessions and Ollama for most 8b models. I love it.
> Perhaps it’s my age, or perhaps it’s just that the architecture of these new Mac chips is so damn good.
I feel the same about my laptop from 2011, so I guess it is partly age (not feeling the urge to always have the greatest) and partly that non-LLM, non-gaming computing is not demanding enough to force us to upgrade.
I think the last decade had an explosion in the amount of resources browsers needed and used (partly workloads moving over, partly moving to more advanced web frameworks, partly electron apps proliferating).
The last few years Chrome seems to have stepped up energy and memory use, which impacts most casual use these days. Safari has also become more efficient, but it never felt bloated the way Chrome used to.
I have exactly the same experience, usually after 3 years I'm desperate for new Mac but right now I genuinely think I'd prefer not to change. I have absolutely no issues with my M1 Pro, battery and performance is still great.
But this ad is specifically for you! (Well, and those pesky consumers clinging on to that i7!):
> Up to 7x faster image processing in Affinity Photo when compared to the 13‑inch MacBook Pro with Core i7, and up to 1.8x faster when compared to the 13-inch MacBook Pro with M1.
The only reason I'd want to upgrade my M1 Pro MBP is because I kind of need more RAM and storage. The fact that I'm even considering a new laptop just for things that before could have been a trivial upgrade is quite illuminating.
100% agree on this. I've had this thing for 3 years and I still appreciate how good it is. Of course the M4 tingles my desire for new cool toys, but I honestly don't think I would notice much difference with my current use.
I feel exactly the same. The one thing that would get me to pull the trigger on a newer one is if they start supporting SVE2 instructions, which would be super useful for a specific programming project I’ve been playing with.
Agreed. Also rocking a M1 Pro MBP and can’t see myself replacing it until it dies
The M1 series was too good. Blows Intel Macs out of the water. But I still have an M1 Max. It’s fantastic.
I have the same feeling performance-wise with the laptop I bought in 2020 with a Ryzen 7 4800H.
But it's a heavy brick with a short battery life compared to the M1/2/3 Mac.
My 2019 Intel MBP is getting long in the tooth. These M4 Pros look great to me.
The base model is perfect. Now to decide between the M3/M4 Air and the M4 Pro.
I’m using the M3 Air 13 in (splurged for 24 GB of RAM, I’m sure 16 is fine) to make iOS apps in Xcode and produce music in Ableton and it’s been more than performant for those tasks
Only downside is the screen. The brightness sort of has to be maxed out to be readable and viewing at a wrong angle makes even that imperfect
That said it’s about the same size / weight as an iPad Pro which feels much more portable than a pro device
Same for me. The only reason to replace it, is that my M1 pro’s SSD or battery will go bad or if I accidentally drop the machine and something breaks.
I am replacing a Dell laptop because the case is cracking, not because it's too slow (it isn't lightning fast, of course, but it sure is fast enough for casual use).
I replaced my M1 Air battery last year and it's still going like a champ. $129 for another 3 years of life is a bargain.
Same. I have an M1 Max 64GB. It has great battery life and I never feel myself waiting on anything. Such an amazing computer all around.
Same. The upgrade from my Intel MBP to the 2021 M1 Pro was huge, but I haven't felt the need to upgrade at all.
I got 6+ years out of my last intel MacBook Pro and expect at least the same from my M1 Max. Both have MagSafe and hdmi output :)
This is how it feels to own a desktop computer.
Tbf, the only thing I miss with my M2 MacBook is the ability to run x86_64 VMs with decent performance locally.
I’ve tried a bunch of ways to do this - and frankly the translation overhead is absolute pants currently.
Not a showstopper though, for the 20-30% of complete pain in the ass cases where I can’t easily offload the job onto a VPS or a NUC or something, I just have a ThinkPad.
I expect this trend to begin reversing as we start getting AI models that are intended to run locally.
Interesting, I have a M2 Pro Mac Mini and I hit limits literally every day
All hardware has limits. Which ones are you hitting every day?
Yup, honestly the main reason I'd like to upgrade from my M1 MBA is the newer webcams are 1080p instead of 720p, and particularly much better in low light like in the evening.
Has nothing whatsoever to do with CPU/memory/etc.
If you're in the ecosystem, get an iPhone mount - image quality is unreal compared to anything short of some fancy DSLR setup - it is some setup, but not much, with the magnets in the iPhone.
Ditto... will probably upgrade when the battery is dead !
when the hardware wait time is the same as the duration of my impulsive decisions i no longer have a hardware speed problem, i have a software suggestion problem
I got an MBP M1 with 32gb of RAM. It'll probably be another 2-3 years or longer before I feel the pressure to upgrade if not longer. I've even started gaming (something I dropped nearly 20 years ago when I switched to mac) again due to Geforce Now, I just don't see the reason.
Frankly though, if the mac mini was a slightly lower price point I'd definitely create my own mac mini cluster for my AI home lab.
I hate to say it but that's like a boomer saying they never felt the need to buy a computer, because they've never wished their pen and paper goes faster. Or a UNIX greybeard saying they don't need a Mac since they don't think its GUI would make their terminal go any faster. If you've hit a point in your life where you're no longer keeping up with the latest technological developments like AI, then of course you don't need to upgrade. A Macbook M1 can't run half the stuff posted on Hugging Face these days. Even my 128gb Mac Studio isn't nearly enough.
> If you've hit a point in your life where you're no longer keeping up with the latest technological developments like AI, then of course you don't need to upgrade.
That's me, I don't give a shit about AI, video editing, modern gaming or Kubernetes. That newest and heaviest piece of software I care about is VSCode. So I think you're absolutely correct. Most things new since Docker and VSCode has not contributed massively to how I work and most of the things I do could be done just fine 8-10 years ago.
I think the difference is that AI is a very narrow niche/hobby at the moment. Of course if you're in that niche having more horsepower is critical. But your boomer/greybeard comparisons fall flat because they're generally about age or being set in your ways. I don't think "not being into AI image generation" is (currently) about being stuck in your ways.
To me it's more like 3d printing as a niche/hobby.
on ai being a niche/hobby at the moment... feels like something a unix greybeard would say about guis in the late 70s...
Playing with them locally? Yes, of course it's a niche hobby. The people doing stuff with them that's not either playing with them or developing not just an "AI" product, but a specific sort of AI product, are just using ChatGPT or some other prepackaged thing that either doesn't run locally, or does, but is sized to fit on ordinary hardware.
< 1% of all engagement with a category thing is niche/hobby, yes.
I get that you're probably joking, but - if I use Claude / ChatGPT o1 in my editor and browser, on an M1 Pro - what exactly am I missing by not running e.g. HF models locally? Am I still the greybeard without realising?
Privacy? Lots of companies do not allow using public chatbots for anything proprietary.
It's like asking what you're missing by not using Linux if you're using Windows.
It's something a regular person would say to a Unix greybeard, which in and of itself was always and still is a very niche hobby.
Or what a prokaryote would say about eukaryotes.
Seems like we've reached the "AI bro" phase...
Using the term "bro" assumes that all AI supporters are men. This erases the fact that many women and nonbinary people are also passionate about AI technology and are contributing to its development. By using "AI bro" as an insult, you are essentially saying that women and nonbinary people are not welcome in the AI community and that our contributions don't matter. https://www.reddit.com/r/aiwars/comments/13zhpa7/the_misogyn...
Is there an alternative term you would prefer people to use when referring to a pattern of behavior perceived as a combination of being too excited about AI and being unaware (perhaps willfully) that other people can be reasonably be much less interested in the hype? Because that argument could definitely benefit from being immune to deflections based on accusations of sexism.
When I see that someone is excited about something, I believe in encouraging them. If you're looking for a more polite word to disparage people who love and are optimistic about something new, then you're overlooking what that says about your character. Also AI isn't just another fad like NFTs and web3. This is it. This is the big one.
> Also AI isn't just another fad like NFTs and web3. This is it. This is the big one.
That's thoroughly unconvincing. That kind of talk is exactly what so many people are tired of hearing. Especially if it's coming from technically-minded people who don't have any reason to be talking like PR drones.
What makes you think I care about convincing you? These days every shot caller on earth is scrambling to get piece of AI. Either by investing in it or fighting it. You come across as someone who wants to hate on AI. Haters aren't even players. They're NPCs.
So people who aren't obsessed with AI to your liking are:
- boomer luddites
- primitive single-celled organisms
- NPCs
And even people who are enthusiastic about AI but aren't fanatical about running it locally get scorn from you.
I can understand and forgive some amount of confirmation bias leading you to overestimate the importance and popularity of what you work on, but the steady stream of broad insults at anyone who even slightly disagrees with you is dismaying. That kind of behavior is wildly inappropriate for this forum. Please stop.
You know, getting upset about being called an AI bro rings a lot more hollow if you go around calling people NPCs.
Huh?
How old are you?
"Bro" has been gender neutral for over a decade. Males and females under the age of 25 call each other "bro" all the time.
That’s interesting because I would’ve thought having strong local compute was the old way of thinking. I run huge jobs that consume very large amounts of compute. But the machines doing the work aren’t even in the same state I’m in. Then again maybe I’m even older as I’m basically on the terminal server / mainframe compute model. :)
I work with AI models all day every day, keep up with everything, love frontier tech, I love and breathe LLMs. And I, like OP, haven't seen the need to upgrade from the M1 MBP because it runs the small 1-7B models just fine, and anything bigger I want on some GPU instance anyway, or I want a frontier model which wouldn't run on the newest and biggest MBP. So it's not just us Boomers hating on new stuff, the M series MacBooks are just really good.
I fully support using Macbooks as a thin client into a better computer. So long as it's your computer.
> A Macbook M1 can't run half the stuff posted on Hugging Face these days.
Example?
LLaMA 3.1 405B
Given that models are only going to get larger, and the sheer amount of compute required, I think the endgame here is dedicated "inference boxes" that actual user-facing devices call into. There are already a couple of home appliances like these - NAS, home automation servers - which have some intersecting requirements (e.g. storage for NAS) - so maybe we just need to resurrect the "home server" category.
I agree, and if you want to have the opportunity to build such a product, then you need a computer whose specs today are what a home server would have in four years. If you want to build the future you have to live in the future. I'm proud to make stuff most people can't even run yet, because I know they'll be able to soon. That buys me time to polish their future and work out all the bugs too.
I thought LLaMA 3.1 405B was a relatively huge model. Is the size of this model really typical of half the models you find on Hugging Face these days?
You could not have said this better.
So every user of a computer that doesn't create their own home-grown ML models is a boomer? This can't possibly be a generational thing. Just about everyone on the planet is at a place in their life where they don't make their own AIs.
Eventually as the tools for doing it become better they'll all want to or need to. By then, most computers will be capable of running those tools too. Which means when that happens, people will come up another way to push the limits of compute.
I don't think there's been any sort of processor like this for the last 10 years. It really makes me feel like I need to upgrade.
What I do know is that Linux constantly breaks stuff. I don't even think it's treading water. These interfaces are actively getting worse.
I also have an M1 Pro MBP and mostly feel the same. The most tempting thing about the new ones is the space black option. Prior to the M1, I was getting a new laptop every year or two and there was always something wrong with them - butterfly keyboard, Touch Bar etc. This thing is essentially perfect though, it still feels and performs like a brand new computer.
Same boat—I'm on a lowly M1 MacBook Air, and haven't felt any need to upgrade (SwiftUI development, video editing, you name it), which is wild for a nearly 4 year-old laptop.
Yeah, I feel like Apple has done the opposite of planned obsolescence with the M chips.
I have a Macbook Air M1 that I'd like to upgrade, but they're not making it easy. I promised myself a couple of years ago I'll never buy a new expensive computing device/phone unless it supports 120 hertz and Wi-Fi 7, a pretty reasonable request I think.
I got the iPhone 16 Pro, guess I can wait another year for a new Macbook (hopefully the Air will have a decent display by then, I'm not too keen to downgrade the portability just to get a good display).
Apple equipment always lasts a long time and retains value on the second-hand market.
Not true. Look at how little supercharged Intel Macs are going for on Facebook Marketplace.
The quality stuff retains value, not brand.
Comparing against the Intel era is a bit apples (excuse me) to oranges. Technical generation gaps aside, Apple products hold value well.
So the Intel era doesn't count as Apple products? The butterfly keyboard is not an Apple invention?
They have the highest product quality of any laptop manufacturer, period. But to say that all Apple products hold value well is simply not true. All quality products hold value well, and most of Apple's products are quality.
I guarantee you that if Apple produced a trashy laptop it would have no resell value.
Again, the quality holds the value not the brand.
It's expected Intel-based Macs would lose value quickly considering how much better the M1 models were. This transition was bigger than when they moved from PowerPC to Intel.
One complicating factor in the case of the Intel Macs is that an architectural transition happened after they came out. So they will be able to run less and less new software over the next couple of years, and they lack most AI-enabling hardware acceleration.
That said, they did suffer from some self-inflicted hardware limitations, as you hint. One reason I like the MBP is the return of the SD card slot.
Similar for me. MacBook Air M1 (8 cpu / 8 gpu; 16 GB RAM)...running in or out of clamshell with a 5k monitor, I rarely notice issues. Typically, if I'm working very inefficiently (obnoxious amount of tabs with Safari and Chrome; mostly web apps, Slack, Zoom, Postman, and vscode), I'll notice a minor lag during a video call while screen sharing...even then, it still keeps up.
(Old Pentium Pro, PII, multi chip desktop days) -- When I did a different type of work, I would be in love with these new chips. I just don't throw as much at my computer anymore outside of things being RAM heavy.
The M1 (with 16 GB ram) is really an amazing chip. I'm with you, outside of a repair/replacement? I'm happy to wait for 120hz refresh, faster wifi, and longer battery life.
> Yeah, I feel like Apple has done the opposite of planned obsolescence with the M chips.
They always have. If you want an objective measure of planned obsolescence, look at the resale value. Apple products hold their resale value better than pretty much every competitor because they stay useful for far longer.
Does anyone know of any good deals on the older models of apple laptops? Now is usually a great time to purchase (a still very capable) older model.
Most retailers have had the older models on closeout for a few weeks now. Best Buy, Amazon and Costco have had the M3 models for a few hundred off depending on models.
Watch SlickDeals. I think it was this time last year where lots of refurbs/2 generation old machines were going for massive discounts. Granted they were M1 machines, but some had 64GB RAM and 4TB drives for like $2700. Microcenter and B&H are good ones to watch as well.
The M-series macbooks depreciate in value far slower than any of the Intel models. M1 base models can still sell for nearly $1k. It's difficult to find a really good deal.
The refurbished store is always a good place to have a look through.
I wish Apple would let me max out the RAM on a lower performance chip. That’s more valuable to me than more compute.
I think it's just one of the tradeoffs of having everything on one SOC. They can only realistically and efficiently make so many versions.
Question without judgement: why would I want to run an LLM locally? Say I'm building a SaaS app and connecting to Anthropic using the `ai` package. Would I want to cut over to ollama+something for local dev?
Data privacy -- some stuff, like all my personal notes I use with a RAG system, just doesn't need to be sent to some cloud provider to be data mined and/or have AI trained on it.
For me it is consistency. I control the model and the software so I know a local LLM will remain exactly the same until I want to change it.
It also avoids the trouble of using a hosted LLM that decides to double their price overnight, costs are very predictable.
Lack of censorship.
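If you do want to try the local-dev route, the low-friction option is that Ollama exposes an OpenAI-compatible endpoint on localhost, so an OpenAI-style client can be pointed at it with a one-line change. A minimal Python sketch, assuming `ollama serve` is running on its default port and treating the model name ("llama3.2") as an illustrative placeholder for whatever you've pulled:

    # Sketch: talk to a local Ollama server via its OpenAI-compatible API.
    # Assumes `ollama serve` is running on the default port 11434 and a model
    # has already been pulled; "llama3.2" below is purely illustrative.
    from openai import OpenAI

    client = OpenAI(
        base_url="http://localhost:11434/v1",  # Ollama's local endpoint
        api_key="ollama",  # any non-empty string; the local server ignores it
    )

    resp = client.chat.completions.create(
        model="llama3.2",
        messages=[{"role": "user", "content": "Say hello from local dev."}],
    )
    print(resp.choices[0].message.content)

Swapping the base_url back to a hosted provider for production keeps the rest of the app code unchanged.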
Does anyone understand this claim from the press release?
> M4 Max supports up to 128GB of fast unified memory and up to 546GB/s of memory bandwidth, which is 4x the bandwidth of the latest AI PC chip. This allows developers to easily interact with large language models that have nearly 200 billion parameters.
Having more memory bandwidth is not directly helpful in using larger LLM models. A 200B param model requires at least 200GB RAM quantized down from the original precision (e.g. "bf16") to "q8" (8 bits per parameter), and these laptops don't even have the 200GB RAM that would be required to run inference over that quantized version.
How can you "easily interact with" 200GB of data, in real-time, on a machine with 128GB of memory??
q4 or q5 quantization.
Edit: Actually you'd want q3 to fit a 200B model into 128GB of RAM. e.g. this one is about 140GB https://huggingface.co/lmstudio-community/DeepSeek-V2.5-GGUF...
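Back-of-the-envelope, the weight footprint is roughly parameters x bits-per-weight / 8, before KV cache, activations, and runtime overhead. A quick sketch of how the common quantization levels shake out for a ~200B-parameter model:

    # Rough weight-only memory estimates for a ~200B-parameter model.
    # Real usage is higher once KV cache and runtime overhead are added.
    PARAMS = 200e9

    for name, bits in [("bf16", 16), ("q8", 8), ("q5", 5), ("q4", 4), ("q3", 3)]:
        gb = PARAMS * bits / 8 / 1e9
        print(f"{name:>4}: ~{gb:.0f} GB of weights")

    # bf16 ~400 GB, q8 ~200 GB, q5 ~125 GB, q4 ~100 GB, q3 ~75 GB,
    # hence needing roughly q3/q4 to leave any headroom under 128 GB.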
Wouldn't it be incredibly misleading to say you can interact with an LLM, when they really mean that you can lossy-compress it to like 25% size where it becomes way less useful and then interact with that?
(Isn't that kind of like saying you can do real-time 4k encoding when you actually mean it can do real-time 720p encoding and then interpolate the missing pixels?)
Still, no matter how much you are willing to spend, you cannot buy a MacBook Pro with an LTE modem, like the ones in the iPhone, iPad, and Watch.
Tethering to an iPhone is so easy though - just select it in the Wifi menu. I'm not sure if I'd ever pay for an LTE modem option. I'm sure it would be better efficiency and performance to have it built-in, but I wouldn't think many people care enough about that small difference to offer it as an option.
I use the tethering quite often. I have for years. It is flaky and burns two batteries instead of one. I agree that many people do not care. Some of us who are traveling a lot are willing to pay for more options.
It's not about efficiency or performance, it's about not having to own the iPhone in the first place. Just put a SIM card inside the laptop and forget about it. Windows laptops can even seamlessly switch between wifi and LTE depending on which one is available. But of course Apple would never allow that because they want to force you to own the full set of Apple devices. Laptop being self-sufficient would be against their policy.
Not to mention that in the US the cell phone carriers artificially limit tethering speed or put data caps on it when you tether from your phone. You have to buy a dedicated data-only plan and modem.
I wonder if one of the obstacles is the amount of data that would likely be used.
Most cellular carriers offer unlimited on-device data plans, but they cap data for tethering. Integrating an LTE modem into a laptop essentially requires a mobile data plan with unlimited tethering - which, AFAIK, doesn’t exist at the moment. I’m not sure why.
Integrating an LTE modem into an iPad requires a mobile data plan, and thats about it. It's not "tethered" if its built into the device.
I've always heard that patent disputes were at the root of the lack of a modem option. Apple had a prototype MacBook Pro back in the early Intel days IIRC but it was never released.
Maybe if Apple ever gets their in-house modems working, we'll see them on all of the product lines, but until then, it's a niche use case that likely isn't causing them to lose a ton of sales.
> It's not "tethered" if its built into the device.
I understand that. My point is that I think an LTE modem in a laptop might reasonably use far more data than an LTE modem in a phone or tablet. Most people who download and/or upload very large files do so on their computer rather than their mobile devices.
Dell laptops can be configured with LTE modems.
There is no reason macOS cannot have some option for throttling usage by background updates when connected over LTE. iPads have an LTE option.
That carriers have not figured out how to charge me by the byte over all my devices instead of per device is really not a big issue to me. I would like to pay for an LTE modem and the necessary bandwidth.
My intuition is that when Apple has their own LTE modem and is not dependent on Qualcomm, a MacBook Pro will have an option similar to that for Dell power users.
The industry as a whole is trying its best to not rely on Qualcomm, given its extremely litigious past. Apple already tried once to avoid using their chips for the iPhone's modem, which I seem to recall failed. When it comes to devices for enterprise, it's less of a gamble because the cost can be passed on to orgs who are less price sensitive.
I think the biggest obstacle is the Qualcomm patents. There is no good reason why a MacBook Pro cannot have a feature that Dells have.
Can someone please help me out with this? I'm torn between the Mac mini and the MacBook Pro, specifically the CPU spec difference.
MBP: Apple M4 Max chip with 16‑core CPU, 40‑core GPU and 16‑core Neural Engine
Mac mini: Apple M4 Pro chip with 14‑core CPU, 20‑core GPU, 16-core Neural Engine
What kind of workload would make me regret not having bought MBP over Mac mini given the above. Thanks!
Doesn't it make a bigger difference that one of them is a laptop and one of them is a mini computer that you have to leave plugged in?
Since the only real difference is the number of GPU cores, it'd be:
- photo/video editing
- games, or
- AI (training / inference)
that would benefit from the extra GPUs.
The Max also has much more memory bandwidth. Especially for running local LLMs, that is the limiting factor much more than the number of GPU cores is.
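A rough rule of thumb (ignoring batching, caching, and MoE sparsity): each generated token has to stream essentially the whole weight set from memory, so single-stream decode speed is capped at bandwidth divided by model size. A quick sketch, where 546 GB/s is the M4 Max figure from the press release, 273 GB/s is the commonly reported M4 Pro number (an assumption on my part), and the 40 GB model size is illustrative (roughly a 70B model at 4-bit):

    # Crude upper bound on decode tokens/sec for a dense model:
    # every new token reads (approximately) all weights once from memory.
    def max_tokens_per_sec(bandwidth_gb_s: float, weights_gb: float) -> float:
        return bandwidth_gb_s / weights_gb

    weights_gb = 40  # illustrative: ~70B parameters at ~4-bit quantization
    for chip, bw in [("M4 Pro (~273 GB/s, reported)", 273),
                     ("M4 Max (546 GB/s, per press release)", 546)]:
        print(f"{chip}: at most ~{max_tokens_per_sec(bw, weights_gb):.0f} tok/s")

Real throughput lands below these ceilings, but the ratio between the chips is roughly what you see in practice, which is why bandwidth matters more than core count here.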
^3D work - Maya, Blender, etc. Probably would be best on a Studio or workstation if/when those are available again.
For normal web dev, any M4 CPU is good as it is mostly dependent on single core speed. If you need to compile Unreal Engine (C++ with lots of threads), video processing or 3D rendering, more cores is important.
I think you need to pick the form factor that you need combined with the use case:
- Mobility and fast single core speeds: MacBook Air
- Mobility and multi-core: MacBook Pro with M4 Max
- Desktop with lots of cores: Mac Studio
- Desktop for single core: Mac mini
I really enjoy my MacBook Air M3 24GB for desktop + mobile use for webdev: https://news.ycombinator.com/item?id=41988340
The single most annoying thing about this announcement for me is the fact that I literally just paid for an Asus ProArt P16 [0] on the basis that the Apple offerings I was looking at were too expensive. Argh!
[0] https://au.store.asus.com/proart-p16-h7606wi-me124xs.html
Trying to find how many external displays the base model supports. Because corps almost always buy the base model #firstworldproblems
The base model doesn't support thunderbolt 5.
And the base model still doesn't support more than 2 external displays without DisplayLink (not DisplayPort!) hardware+software.
Two displays with the lid open.
"The display engine of the M4 family is enhanced to support two external displays in addition to a built-in display."
https://www.apple.com/newsroom/2024/10/apple-introduces-m4-p...
https://www.apple.com/macbook-pro/specs/
"M4 and M4 Pro
Simultaneously supports full native resolution on the built-in display at 1 billion colors and:
Up to two external displays with up to 6K resolution at 60Hz over Thunderbolt, or one external display with up to 6K resolution at 60Hz over Thunderbolt and one external display with up to 4K resolution at 144Hz over HDMI
One external display supported at 8K resolution at 60Hz or one external display at 4K resolution at 240Hz over HDMI"
My wallet is trembling.
On a side note, anyone know what database software was shown during the announcement?
What's the timestamp? At 3:43 there's Luna Modeler.
https://www.datensen.com/data-modeling/luna-modeler-for-rela...
Thanks. That's it. Coincidentally, I found out what it was by looking at the actual press release where they had a screenshot of it too.
> MacBook Pro with M4 Max enables:
> Up to 4.6x faster build performance when compiling code in Xcode when compared to the 16‑inch MacBook Pro with Intel Core i9, and up to 2.2x faster when compared to the 16‑inch MacBook Pro with M1 Max.
OK, that's finally a reason to upgrade from my M1.
The real question is. Can I plug two monitors to it?
You can. And use your laptop screen as the third one.
Does anyone know if there is a way to use Mac without the Apple bloatware?
I genuinely want to use it as a primary machine, but with this Intel MacBook Pro I have, I absolutely dislike FaceTime, iMessage, the need to use the App Store, and Apple always asking me to have an Apple username and password (which I don't have and have zero intention of creating). I want to block Siri, disable all the telemetry stuff Apple has baked in, stop the machine calling home, etc.
This is to mirror the tools available in Windows to disable and remove Microsoft bloatware and the ad tracking built in.
There is zero iCloud account requirement. You do not need to use the App Store. Gatekeeper can be disabled with a configuration profile key. Telemetry (what little there is) can be disabled with a configuration profile key. Siri can be disabled, all of the generative AI crap can be disabled, yadda yadda yadda, with a configuration profile key. Every background service can be listed and disabled if you disable authenticated-root. Hell, you could disable `apsd` and disable all push notifications too, which require a phone home to Apple.
You don't need to use the App Store, unless of course you want to use Apple software.
Pretty much all the software I use is from brew.
this ^^
IIRC Apple is a lot less heavy-handed wrt service login requirements compared to Microsoft's most recent Windows endeavors. And depending on the developer, you can get around having to use the App Store at all. Seeing as you're on an Intel Mac, have you considered just using Linux?
You can use OSX without an Apple account, and paired with a 3rd-party host-based firewall (Little Snitch), the OS usually stays out of your way (imo). Bundled apps can be removed after disabling SIP (System Integrity Protection), but there are downsides/maintenance to that route.
At a linux conference I saw many macbooks. Talked to a few, they just ran linux in a VM full screen for programming and related. Then used OSX for everything else (office, outlook, teams, work enforced apps, etc). They seemed very happy and this encouraged them to not task switch as often.
Do you mean you want to use Apple Silicon without macOS?
If that's your question, yes - various options exist like https://asahilinux.org
You can totally use it without ever signing in to Apple account. You cannot delete Siri etc, but you can disable parts of it and not use the rest.
There used to be this whole contingent of people who were adamant that Apple's software was too opinionated and bloated, that you couldn't adapt its OS to your needs, that Apple was far too ingrained in your relationship with your device, that Linux was true freedom, and that at least Windows respected its users.
Then Windows 11 came out.
I belong to that contingent, and I still stand by the assertion that Apple's software is too opinionated, configurability is unreasonably low, and you have to stick to the Apple ecosystem for many things to get the most out of it.
My primary desktop & laptop are now both Macs because of all the malarkey in Win11. Reappearance of ads in Start and Windows Recall were the last straws. It's clear that Microsoft is actively trying to monetize Windows in ways that are inherently detrimental to UX.
I do have to say, though, that Win11 is still more customizable overall, even though it - amazingly! - regressed below macOS level in some respects (e.g. no vertical taskbar option anymore). Gaming is another major sticking point - the situation with non-casual games on macOS is dismal.
Happened a lot earlier than 11.
You need to embrace Apple's vision, or use something else. Clearly your goals and Apple's are misaligned, so you will only feel pain when using a Mac.
Get a PC.
I gave up on macos when they started making the OS partition read-only. A good security feature in general, but their implementation meant that changing anything became a big set of difficulties and trade-offs.
That, combined with the icloud and telemetry BS, I'd had enough.
Not only good security, but it also makes software updates a lot faster because you don't have to check if the user has randomly changed any system files before patching them.
Upgraded to an M1 Pro 14 in December 2021, and I still rock it every day for dev purposes. Apple makes great laptops.
The only downsides are that I see a kind of "burnt?" transparent spot on my screen, and when connecting an HDMI cable, the sound does not output properly to the TV and it makes the video I play laggy. Wondering if going to the Apple Store would fix it?
If you're still under AppleCare+, definitely give it a try before it expires.
Personal anecdote: don't get your hopes up. I've had my issues rejected as 'no fault found', but it's definitely worth spending a bit of time on.
> MacBook Air with M2 and M3 comes standard with 16GB of unified memory, and is available in midnight, starlight, silver, and space gray, starting at $999 (U.S.) and $899 (U.S.) for education.
At long last, I can safely recommend the base model macbook air to my friends and family again. At $1000 ($900 with edu pricing on the m2 model) it really is an amazing package overall.
No wifi 7? Are others shipping it?
Strange because their latest iPhones do have Wifi 7
Yup, Wi-Fi 7 devices have been shipping for over a year. My Odin 2 portable game console has Wi-Fi 7.
As for the section where they demoed Apple Intelligence assisting the researcher in creating an abstract and adding pictures to their paper: is it better or worse to do this? People are already complaining so heavily about dead internet theory, with the 'AI voice' being so prominent.
Wonder how good those are for LLMs (compared to the M3 Pro/Max)... They talk about the Neural Engine a lot in the press release.
I'm not sure we can leverage the neural cores for now, but they're already rather good for LLMs, depending on what metrics you value most.
A specced out Mac Studio (M2 being the latest model as of today) isn't cheap, but it can run 180B models, run them fast for the price, and use <300W of power doing it. It idles below 10W as well.
That ad reveal at the end. Someone in the marketing team must have started doing CrossFit
Hm, the M3 MacBook Pro had a 96GB of ram model (which is what I have). I wonder why it's not an option with the M4.
M2 pro has 256 bit wide memory, mostly benefiting the GPU perf.
M3 pro has 192 bit wide memory, GPU improvements mostly offset the decrease in memory bandwidth. This leads to memory options like 96GB.
M4 pro has 256 bit wide memory, thus the factor of 2 memory options.
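The bus widths also map straight onto the headline bandwidth numbers. A quick sketch of the arithmetic; the memory transfer rates used here (LPDDR5-6400 for the M3 generation, LPDDR5X-8533 for M4) are commonly reported figures, not something stated in the press release:

    # bandwidth (GB/s) ~= bus width in bytes * transfer rate in MT/s
    def bandwidth_gb_s(bus_bits: int, mts: int) -> float:
        return bus_bits / 8 * mts * 1e6 / 1e9

    print(bandwidth_gb_s(192, 6400))  # M3 Pro-class:  ~154 GB/s
    print(bandwidth_gb_s(256, 8533))  # M4 Pro-class:  ~273 GB/s
    print(bandwidth_gb_s(512, 8533))  # M4 Max-class:  ~546 GB/s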
The 96GB option was with the M2 Max and M3 Max chips, not the M2 Pro or M3 Pro.
DRAM chips don't just come in power of two sizes anymore. You can even buy 24GB DDR5 DIMMs.
It is interesting they only support 64gb and then jump to 128gb. It seems like a money play since it's $1,000 to upgrade for 128, and if you're running something that needs more than 64 (like LLMs?) you kind of have no choice.
No OLED yet :(
I'm waiting for OLED. Will purchase as soon as they do it.
Great, I can't wait for the software bloat to expand and make these the only machines that feel fast.
I have an M2 Max now, and it's incredible. But it still can't handle running Xcode's Instruments. I'd upgrade if the M4s could run the leaks tool seamlessly, but I doubt any computer could.
Once they get a MacBook Air with an M4, it will become a viable option for developers and other users that want/need 2 external monitors. Definitely looking forward to that happening.
The M3 Air does support 2 but only with the lid closed
Announcement video as well
https://youtu.be/G0cmfY7qdmY?si=vbgIr8zn9EzB2Xam
The machine is great! How is its performance for AI model training? A lot of libraries and tools are not built for M-series chips.
Poor. My M3 Max/128GB is about 20x slower than 4090. For inference it's much better, still much slower than 4090 but it enables working with much larger LLMs albeit at ~10t/s (in comparison, Threadripper 2990WX/256GB does like 0.25t/s). M4 Max is likely going to be ~25% faster than M3 Max based on CPU perf and memory bandwidth.
No WiFi 7!
:/
New 12MP Center Stage Camera. Will it support 4k?
The 12MP sensor will be used for better framing; there is still almost no use case for 4K-quality video conferencing.
It is truly sad how bad Zoom / Google Meet / Teams are when it comes to video quality.
I look at my local source vs the recording, and I am baffled.
After a decade of online meeting software, we still stream 480p quality it seems.
FaceTime has great quality. Unfortunately, as you age you start to hate the quality.
When I have a full team of people with 1080p webcams and a solid connection I can notice the quality. Most of the time not everyone fulfills those requirements and the orchestrator system has to make do
I mean you can easily create your own fully meshed P2P group video chat in your browser just using a little bit of JS that would support everyone running 4k, but it will fail the moment you get more than 3-8 people, as each person's video stream is eating 25mbps for every side of a peer connection (or 2x per edge in the graph).
A huge part of group video chat is still "hacks" like downsampling non-speaking participants so the bandwidth doesn't kill the connection.
As we get fatter pipes and faster GPUs streaming will become better.
edit: I mean... I could see a future where realtime video feeds never get super high resolution and everything effectively becomes a relatively seamless AI recreation where only facial movement data is transmitted, similar to how game engines work now.
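For a sense of why full mesh falls over: each of the n participants uploads their stream to the other n-1 peers, so total traffic grows quadratically. A toy calculation using the 25 Mbps-per-stream figure above (itself just an assumed high-quality bitrate):

    # Full-mesh group call: each of n participants sends their stream
    # to every other participant (n - 1 uplinks per person).
    STREAM_MBPS = 25  # assumed per-stream bitrate from the comment above

    for n in [3, 5, 8, 12]:
        per_person_up = (n - 1) * STREAM_MBPS
        total = n * (n - 1) * STREAM_MBPS  # all uplinks combined
        print(f"{n:>2} people: {per_person_up:>4} Mbps upload each, {total:>5} Mbps total")

Which is exactly why SFUs, simulcast, and downsampling non-speakers exist.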
I'm not asking for 4k.
I am asking for good 720p... With how static cam footage is it would be less than 8mbps probably.
4k for videoconferencing is nuts. The new camera should be an improvement over the old. Plus, being able to show your actual, physical desktop can be handy too. Using your iPhone as the webcam will still probably give you the best quality, especially if you are in a lower-light situation.
I don't think so. They would have made that a huge deal.
Tech specs confirm only 1080p recording.
Disingenuous to mention the x86 based MacBooks as a basis for comparison in their benchmarks; they are trying to conflate current-gen Intel with what they shipped more than 4 years ago.
Are they going to claim that 16GB RAM is equivalent to 32GB on Intel laptops? (/sarc)
Lots of people don't upgrade on the cadence that users on this forum do. Someone was mentioning yesterday that they are trying to sell their Intel Mac [edit: on this forum] and asking advice on getting the best price. Someone else replied that they still had a 2017 model. I spoke to someone at my job (I'm IT) who told me they'd just ordered a new iMac to replace one that is 11 years old. There's no smoke and mirrors in letting such users know what they're in for.
Yup, I'm a developer who still primarily works on a 2018 Intel Mac. Apple's messaging felt very targeted towards me. Looking forward to getting the M4 Max as soon as possible!
Oh, wow. You are in for a treat.
The only downside is that your computer will no longer double as a space heater :p
Indeed. The one positive feature of the 2019 MBP I briefly had to use was that my cat loved taking naps on it.
I have a 2013 Macbook Air as a casual browsing machine that's still going strong (by some definition of it) after a battery replacement.
Right, it's obviously that, not a marketing trick to make numbers look much bigger while comparing to old CPUs and laptops :)
Given that they also compare it to an M1 in the same aside, I'd say you're wrong.
> Up to 23.8x faster basecalling for DNA sequencing in Oxford Nanopore MinKNOW when compared to the 16-inch MacBook Pro with Core i9, and up to 1.8x faster when compared to the 16-inch MacBook Pro with M1 Pro.
Ben Bajarin said that around 50% of the installed base is still using Macs with Intel chips. You’ll keep hearing that comparison until that number goes down.
They are going to milk these comparisons against the horrendous, crazy-hot, thermally throttled x86 Macs for a decade.
You could see it as disingenuous, or as a targeted message to those users still on those older x86 machines.
Exactly how I read it. I have an intel model, and the press release felt like a targeted ad.
Does anyone have benchmarks for the M4 Pro or M4 Max CPUs yet? Would love to see Geekbench scores for those.
I'm fighting the urge to get the M4 Pro model so bad right now.
At least it appears they didn't hide the power button on the bottom.
The base M4 Max only has an option for 36gb of ram!? They're doing some sus things with that pricing ladder again. No more 96gb option, and then to go beyond 48gb I'd have to spend another $1250 CAD on a processor upgrade first, and in doing so lose the option to have the now baseline 512gb ssd
I'd add that although I find it a bit dirty, the computers are obviously still amazing. It's just a bit bizarre that the lower spec cpu offers the customer the option to change the ram quantity. More specifically, going from the M4 Pro to the M4 Max removes the option to change the ram from 36gb, whereas sticking with the Pro lets you select 48gb or 24gb, unless you choose the max Max. If I pre-order the Mac Mini with the same processor, I can select 64gb for the insane price of an additional $750cad, but it's just not available on the macbook pro M4 Pro.
It would indeed have been nice to see a faster response rate screen, even though I value picture quality more, and it also would have been nice to see even vaguely different colors like the iMac supposedly got, but it seems like a nice spec bump year anyway.
I think any idea that Apple doesn't thoroughly understand the capacity, value, market, price tradeoff is untenable.
The most obvious view is that Apple price gouges on storage. But this seems too simplistic.
My conjecture is that there's an inescapable tension between supply (availability/cost), sales forecasts, technological churn, and roadmaps that leads them to want to somewhat subsidize the lowest end and place a bit of back-pressure on consumption at the high end. The trick is finding the tipping point on the curve between growth and over-commitment by suppliers, especially for tightly vertically integrated products.
The PC industry is more diffuse and horizontal and so more tolerant of fluctuations in supply and demand across a broader network of providers and consumers, leading to a lower, more even cost structure for components and modules.
In real terms, Apple's products keep costing less, just like all computer products. They seem to make a point of holding prices on an appearance point of latest tech that's held steady since the first Macs: about $2500 for a unit that meets the expectations of space right behind the bleeding edge while being reliable, useful and a vanguard of trends.
Seems plausible enough to me, but whether there's a business case or not isn't my concern as much as how it feels to price something out knowing that I'm deliberately gouged on arbitrary components instead of the segmentation being somewhat more meaningful. They're already reaping very high margins, but by tightly coupling quantities of those components to even higher margin builds, it feels a bit gross, to the point where I just have to accept that I'd have to spend even more excessively than in previous years of similar models. As in, I'm happy to pay a bit more for more power if I find it useful, likewise with RAM, but not being able to get more RAM without first getting something I have no way to put to use seems a bit silly, akin to not being able to get better seats in a car unless I first get the racing-spec version, otherwise I'm stuck with a lawn chair.
The notch makes me sad.
You can just turn it off. macOS lets you change the resolution to use just the screen below the notch, and because it's mini-LED, the now unused "flaps" to the sides of the notch are indistinguishable from the rest of the bezel.
It's really two extra flaps. Feel better now?
Does it still come with a crappy 1 year warranty?
Damn, I just bought an M3
Can return it if you bought it recently.
Finally they're starting memory at 16GB.
Looking at how long the 8GB lasted, it's a pretty sure bet that now you won't need to upgrade for a good few years.
I mean, I have a MacBook air with 16gb of ram and it's honestly working pretty well to this day. I don't do "much" on it though but not many people do.
I'd say the one incentive a MacBook Pro has over the Air is the better screen and better speakers. Not sure if it's worth the money.
My hypothesis is Apple is mostly right about their base model offerings.
> I mean, I have a MacBook air with 16gb of ram and it's honestly working pretty well to this day. I don't do "much" on it though but not many people do.
If an HN user can get along with 16gb on their MacBook Air for the last X years, most users were able to get by with 8gb.
It's just a tactic to get a higher average price while being able to advertise a lower price. What makes it infuriating is memory is dirt cheap. That extra 8GB probably costs them $10 at most, but would add to utility and longevity of their hardware quite a bit.
They are supposed to be "green" but they encourage obsolescence.
They align need with more CPU and margin. Apple wants as few SKUs as possible and as much margin as possible.
8GB is fine for most use cases. Part of my gig is managing a huge global enterprise with six figures of devices. Metrics demonstrate that the lower quartile is ok with 8GB, even now. Those devices are being retired as part of the normal lifecycle with 16GB, which is better.
Laptops are 2-6 year devices. Higher end devices always get replaced sooner - you buy a high end device because the productivity is worth spending $. Low end tend to live longer.
People looking for low prices buy PC, they don't even consider Mac. Then they can have a computer with all the "higher numbers", which is more important than getting stuff done.
> pretty sure bet that now you won't need to upgrade for a good few years.
Or you could get a framework and you could actually upgrade parts that are worth upgrading - instead of upgrade as in buying a new one
I bought a framework back in 2020 or so and really wish I just waited a little longer and spent a few hundred bucks more on the M1.
It's fine, but the issue is linux sleep/hibernate - battery drain. To use the laptop after a few days, I have to plug it in and wait for it to charge a little bit because the battery dies. I have to shut it down (not just close the screen) before flying or my backpack becomes a heater and the laptop dies. To use a macbook that's been closed for months I just open it and it works. I'll pay double for that experience. If I want a computer that needs to be plugged in to work I have a desktop for that already. The battery life is not good either.
Maybe it's better now if I take the time to research what to upgrade, but I don't have the time to tinker with hardware/linux config like I did a few years ago.
I don't mind spending a thousand bucks every 7 years to upgrade my laptop. I've had this macbook air since 2020 and besides the speakers don't being the best... I have no complaints.
I don't really see a world where this machine doesn't last me a few more years. If there's anything i'd service would be the battery, but eh. It lasts more than a few hours and I don't go out much.
To sum up the HN wisdom on Apple Silicon Macs:
Before the M4 models: omg, Apple only gives you 8GB RAM in the base model? Garbage!
After the M4 models: the previous laptops were so good, why would you upgrade?
We'll be sure to run our future comments by you to make sure no one contradicts anyone else.
No OLED = skip. Your microBlaBla just causes headaches.
This is the first time they have not obscenely limited their starter offerings with 8GB RAM. The time it took them to do that is just pathetic. Now I guess this will last how long? Maybe until 2034, still starting at 16GB RAM. I wish I could say it's a welcome change, but in 2024, for such an overpriced machine, starting with 16GB RAM is anything but special. Also, I am sure the SSDs and RAM are still soldered tight :)
If only they allowed their iPads to be used as a Mac screen natively, I might buy a Mini and an iPad and be done with it, covering both use cases. But why would Apple want users to be able to do that without extra expense.
The keyboard touch button (top right) is objectively hideous and looks cheap. My current TouchBar may be useless but at least looks nice.
I'm just some dude, looking at a press release, wondering when Tim Apple is gonna be a cool dude and release the MBP in all of the colors that they make the iMac in.
APPARENTLY NOT TODAY.
C'mon mannnnn. The 90s/y2k are back in! People want the colorful consumer electronics! It doesn't have to be translucent plastic like it was back then but give us at least something that doesn't make me wonder if I live in the novel The Giver every time I walk into a meetup filled with MacBook Pros.
I'm sure the specs are great.
In Apple world black means pro. That's why they give you black stickers with pro models and white for everything else.
I really like these new devices, but I’ve found that the latest MacBook Air (M3) is sufficient for my needs as a manager and casual developer. My MacBook Pro M1 Max has essentially become a desktop due to its support for multiple monitors, but since the Mac Mini M4 Pro can also support up to three external displays, I’m considering selling the MacBook Pro and switching to the Mini. I’ve also noticed that the MacBook Pro’s battery, as a portable device, is less efficient in terms of performance/battery (for my usage) compared to the MacBook Air.
Regarding LLMs, the hottest topic here nowadays, I plan to either use the cloud or return to a bare-metal PC.
but does it have touch screen -_-
I used a Surface Pro for 6 years and haven’t missed the touch screen once since switching back to MBP 3 years ago. I would have missed the handwriting input, but that’s what a low-end iPad is for.
Would it make sense to upgrade from M2 Pro 16 to M4 Pro 16? (Both base models.) I mean in terms of numbers: more cores, more RAM, but everything else is pretty much the same. I am looking forward to seeing some benchmarks!
Completely depends on what your workflow is.
No.
Have they published this ahead of other pages or is it just me?
The linked Apple Store page says "MacBook Pro blasts forward with the M3, M3 Pro, and M3 Max chips" so it seems like the old version of the page still?
yes, it's not anywhere but the press release at this time
Looks like it's updated now.
yep, just updated a second ago
I noticed the same, but it looks like the pre-order link now gives me M4 chips instead of M3.
I recently switched back to using homemade desktops for most of my work. I’ve been running Debian on them. Still have my Mac laptop for working on the go.
> while protecting their privacy
This is misleading:
https://news.ycombinator.com/item?id=25074959
"macOS sends hashes of every opened executable to some server of theirs"
> This is misleading: ...
To be fair, the link in this story is to a press release. Arguably there are probably many things in it that can be considered "misleading" in certain contexts.
What's the deal with running Linux on these anyway? Could one conceivably set up an M4 mini as headless server? I presume Metal would be impossible to get working if MacOS uses proprietary drivers for it...
Metal doesn't exist under Linux but OpenGL and Vulkan work.
> Now available in space black and silver finishes.
No space grey?!
I don't think they had Space Grey on the M3 models either. That was initially my preference, but I went with the Black and quite like it.
I find it very odd that the new iMac has WiFi 7 but this does not... Also it is so aggravating they compare to 3 generations ago and not the previous generation in the marketing stats. It makes the entire post nearly useless.
It is very aggravating, but if they advertised a comparison to last year's model and showed you small performance gains you might not want to buy it.
A more charitable interpretation is that Apple only thinks that people with computers a few years old need to upgrade, and they aren't advertising to people with a <1 year old MacBook Pro.
The iMac doesn’t have WiFi 7.
Best time to buy a frame.work Linux Laptop without fomo. I’m done with Apple.
> starting with 16GB of memory
yeah it's about time
As a proud user of an ARM3 in 1992, I'm pleased to be able to see and say that ARM won in the end.
ARMv8 is not much like previous ARMs. But it has won - Intel's latest x86 extension basically turns it into ARMv8.
https://www.intel.com/content/www/us/en/developer/articles/t...
I couldn't disagree more, the defining difference between ARMv8 and x86 is the memory model.
Hmm, I think the difference between x86 and anything else is variable length two-operand instructions. APX changes one of those but not the other.
The difference between ARM and anything else is that ARM has successfully shipped all of its security features (PAC, BTI, MTE) and Intel has not.
The software stack has gotten so bad that no amount of hardware can make up for it.
The compile times for Swift, the gigabytes of RAM everything seems to eat up.
I closed all my apps and I'm at 10gb of RAM being used - I have nothing open.
Does this mean the Macbook Air 8gb model I had 10 years ago would basically be unable to just run the operating system alone?
It's disconcerting. Ozempic for terrible food and car-centric infrastructure we've created, cloud super-computing and 'AI' for coping with this frankenstein software stack.
The year of the Linux desktop is just around the corner to save the day, right? Right? :)
Memory doesn't need to be freed until other software needs it.
I'm referring to what Activity Monitor app tells me in its memory tab - not the underlying malloc/whatever implementation being used.
It tells me my computer is using 8gb of RAM after a restart and I haven't begun to open or close anything.
Yikes?
Activity Monitor counts everything from I/O buffer caches to frame buffers in the "memory used" metric. Add in that MacOS won't free pages until it hits a certain memory pressure and you get a high usage with no desktop apps open.
This also means that cleanly booted machine with 16 GB will show more memory used than a machine with 8 GB.
Apple suggests you use the memory pressure graph instead to determine whether you're low on memory for this reason.
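If you want to see the raw accounting yourself, `vm_stat` (which ships with macOS) dumps the per-category page counts that tools like Activity Monitor aggregate in different ways. A small sketch that parses it; the output format assumed here is the stock "Pages free: N." style, which could differ across macOS versions:

    # Summarize macOS page categories from `vm_stat` output.
    # Assumes the stock "Pages free:   12345." line format.
    import re
    import subprocess

    out = subprocess.run(["vm_stat"], capture_output=True, text=True).stdout
    page_size = int(re.search(r"page size of (\d+) bytes", out).group(1))

    pages = {}
    for line in out.splitlines()[1:]:
        m = re.match(r"(.+?):\s+(\d+)\.", line)
        if m:
            pages[m.group(1).strip()] = int(m.group(2)) * page_size / 1e9  # GB

    for key in ("Pages free", "Pages active", "Pages inactive",
                "Pages speculative", "Pages wired down"):
        if key in pages:
            print(f"{key}: {pages[key]:.1f} GB")

None of those buckets alone is "memory used"; the pressure graph is the better single signal, as Apple suggests.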
It's very hard to measure memory use because it's reactive to how much RAM you have; if you have more then it's going to use it. That doesn't necessarily mean there are any issues.
Can we just get a 32 inch iMac, please?
I'm getting tired of everything else being updated while the product that's most needed is being completely neglected, and has been for years.
And no, I don't wanna buy a separate tiny screen for thousands of dollars.
I'm also not interested in these tiny cubes you deem to be cool.
Lolz the M4 max doesn’t get anything more than 128GB ram in the MacBook? Weird
Because of this, I was expecting 256GB: https://news.ycombinator.com/item?id=41971726#41972721
Am I allowed to work on my laptop if I don't have a PRO cpu?
Only if you work on your hobbies.
The adjectives in the linked article are nauseating. Apple’s marketing team fail as decent humans, writing such drivel.
Give us data, tell us what's new, and skip the nonsense buzz-filling adjectives.
To quote Russell Brand: just say he sat down, not that he placed his luscious ass in silk-covered trousers on a velvety smooth chair, experiencing pleasure as the strained thigh muscles received respite after a gruelling day on their feet, watching a lush sunset in a cool summer's evening breeze.
While we're bashing Apple marketing: `prefers-color-scheme` is a11y. Take your fucking fashion statements elsewhere.
I'm not sure Russell Brand is the best ambassador for plain English.
https://www.youtube.com/watch?v=o6p0W4ZsLXw
I don't think you understand what a press release is.
Most people buying macs don't care about specs, they care about _what they can do_.
I have an m3 ultra. I don’t think I need to upgrade. I also find it amusing they’re comparing the m4 to the m1 and i7 processors.
I find it amusing how you answer your own "question" before asking it. Why would they target the marketing material at people who already know they aren't going to need to upgrade?
Roopepal, did someone piss in your coffee this morning? I had no questions. I’m merely saying that it’s funny they’re comparing to old technology instead of last year’s. It’s a valid criticism. Take a breath.
There is no M3 Ultra.
I meant pro. My mistake. I was going to maybe upgrade it but it appears m4 isn’t much better.