Alleged M4 MacBook Pro unboxing video

(macrumors.com)

80 points | by TaurenHunter 11 hours ago

97 comments

  • takoid 10 hours ago

    M1 -> M2: 2301 -> 2598 = 12.91% increase

    M2 -> M3: 2598 -> 3082 = 18.63% increase

    M3 -> M4: 3082 -> 3864 = 25.37% increase

    Assuming the M4 Geekbench score is legit, it’s interesting to see that the jump from M3 to M4 has the largest percentage increase in single-core performance since the M1 came out. The M1 -> M2 and M2 -> M3 upgrades were solid, but the 25.37% jump from M3 -> M4 is much more significant. It makes you wonder if we’ll keep seeing even bigger performance leaps with future releases, or if there’s something specific about this generation that’s driving such a large gain. Will Apple keep increasing performance at this rate, or is this an exception?
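    Assuming the quoted Geekbench 6 single-core scores, the percentage jumps can be reproduced with a few lines of Python (a quick sketch; the M4 figure comes from the alleged leak and is unconfirmed):

```python
# Gen-over-gen single-core gains from the Geekbench 6 scores quoted above.
# The M4 number comes from the alleged leak and is unconfirmed.
scores = {"M1": 2301, "M2": 2598, "M3": 3082, "M4": 3864}

names = list(scores)
for prev, curr in zip(names, names[1:]):
    gain = (scores[curr] - scores[prev]) / scores[prev] * 100
    print(f"{prev} -> {curr}: {scores[prev]} -> {scores[curr]} = {gain:.2f}% increase")
```

    This reproduces the 12.91%, 18.63%, and 25.37% figures above.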

    • gigatexal 10 hours ago

      Will they keep this up? Probably. Indefinitely? Likely not. But look at what it’s done: there is compelling ARM hardware outside of Apple now, thanks to the renaissance in performance that the M series has shown.

      I’ve an M3 Max and now I can’t wait to see what an M4 Max will do. 256GB of RAM in the top-end laptop? 25-50% more cores?

      • cheema33 9 hours ago

        > There are compelling ARM hardware outside of Apple now

        Not only that, this may have finally forced Intel to innovate. Their new Lunar Lake CPUs coming out now are actually competitive with Apple chips for both performance and power. I thought I might not see X86 become competitive again in my lifetime.

        • sofixa 8 hours ago

          > Their new Lunar Lake CPUs coming out now are actually competitive with Apple chips for both performance and power. I thought I might not see X86 become competitive again in my lifetime.

          I hadn't heard about them, so I Googled it and... "Manufactured by: TSMC". So Intel's inefficiency was more due to their fabs being behind than inherent x86 downsides?

        • jodleif 2 hours ago

          They might be competitive on battery-life, but definitely not performance per watt.

      • kev009 9 hours ago

        That's a good point. I wonder how much of a network effect this has had on legitimizing ARM in the server space too. Vendors have been trying that since the early twenty-teens, and it was not going anywhere fast until Apple shipped the M1. I don't think it is a direct cause-and-effect relationship, but the existence of high-powered dev machines running clang and gcc certainly lowers a lot of friction, even in higher-level languages like Go.

        • sofixa 7 hours ago

          AWS's Graviton line was released in 2018 to much fanfare, and there was immediately a lot of content about the cost savings per unit of performance it brings. Ampere was founded in 2017, and in early 2020 announced a partnership with Oracle for ARM-based servers.

          M1 came out in November 2020.

          A lot of Linux tools, systems, and applications became available thanks to Raspberry Pis becoming very popular with hobbyists, and then Graviton pushing more "enterprise" stuff. For more user-facing, GUI-powered applications it lagged, and still lags, but there Apple's and Microsoft's ARM efforts were the main driving force.

        • dagmx 8 hours ago

          Anecdotally, a lot of software in the graphics space didn’t have ARM builds until after the M series was released. A lot of the Windows/Linux ARM ports of libraries didn’t exist until then, despite the hardware having been available earlier.

          My work on Linux clusters was dramatically improved.

          Having even one vendor prove that ARM is worth paying attention to is the tide that lifts the boats of all the other vendors in the ARM space.

    • leecb 9 hours ago

      I wonder how much of this increase is enabled solely by TSMC process upgrades, either through higher clock speeds or an increased number of transistors per core? It's kind of interesting that every iteration of the M series has corresponded to a change of TSMC process.

      M1: 16 billion transistors at 3.2 GHz on TSMC N5

      M2: 20 billion transistors at 3.7 GHz on TSMC N5P

      M3: 25 billion transistors at 4.05 GHz on TSMC N3B

      M4: 28 billion transistors at ?? GHz on TSMC N3E
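      Assuming those clocks are the peak single-core frequencies, and treating the Geekbench score as frequency × per-clock throughput (a crude model that ignores memory and benchmark-version effects), each generational jump can be roughly split into a frequency part and an implied per-clock part. The M4 is omitted since its clock is still unknown:

```python
# Rough decomposition of each generation's Geekbench 6 single-core gain
# into a frequency part and an implied per-clock (IPC-like) part.
# Model assumption: score ~ clock * per-clock throughput.
chips = {            # (GB6 single-core score, peak clock in GHz)
    "M1": (2301, 3.2),
    "M2": (2598, 3.7),
    "M3": (3082, 4.05),
}

names = list(chips)
for prev, curr in zip(names, names[1:]):
    (s0, f0), (s1, f1) = chips[prev], chips[curr]
    clock_gain = (f1 / f0 - 1) * 100
    per_clock_gain = ((s1 / s0) / (f1 / f0) - 1) * 100
    print(f"{prev} -> {curr}: clock {clock_gain:+.1f}%, per-clock {per_clock_gain:+.1f}%")
```

      By this crude model, the M1 -> M2 jump was more than accounted for by frequency alone, while M2 -> M3 added a meaningful per-clock improvement on top of the clock bump.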

      • dagmx 9 hours ago

        N3E is slightly less efficient than N3B though, it just has higher yields.

        • Fluorescence 7 hours ago

          Do both efficiency and yield play a part in the final off-the-shelf performance?

          e.g. does higher yield mean you get more product in higher bins? I imagine we need to judge a process by the distribution across bins, not just a max efficiency or a total yield.

    • hggigg 9 hours ago

      Numbers are plausible. The M4 iPad is close to an i9-14900k in single thread.

      I’ll buy an M4 mini when one comes out to replace my M1.

      • hulitu 3 hours ago

        > The M4 iPad is close to an i9-14900k in single thread.

        For how many microseconds ? /s

        • hggigg 3 hours ago

          A good point, which is why I want it in a Mac Mini, where they (hopefully) haven't compromised it thermally as much!

    • Retr0id 10 hours ago

      Plausibly a large part of the perf bump comes from the SVE and SME extensions, for workloads that can put them to use.

    • bhouston 7 hours ago

      > Will Apple keep increasing performance at this rate, or is this an exception?

      I want to believe that, but a lot of the performance gains have been the result of frequency boosting, and you cannot do that forever.

      What is the current limit to that? Probably around 5GHz, right?

      They are squeezing more out of the silicon, but frequency has been the primary contributor recently.

    • throw7374 10 hours ago

      Those are nice numbers. But in reality the M2 Air had worse performance, because its cooling system was much worse.

    • DeathArrow 7 hours ago

      I don't think Geekbench is a relevant benchmark for laptops. I would look at other benchmarks such as Cinebench, before drawing any conclusions.

      • saagarjha 7 hours ago

        Why not?

        • michelb an hour ago

          Geekbench usually doesn't run long enough to trigger the throttling on the laptop chips.

    • 10 hours ago
      [deleted]
  • bhouston 9 hours ago

    Judging from the design, the box is distinctly for an M3 MacBook Pro. Apple puts a different screen image on each version, and the M3 had the black/grey pipes design featured in the video.

    https://www.reddit.com/r/macgaming/comments/17g3xh0/leaked_m...

    Did Apple reuse the graphics on M4? I suspect not.

    Is this a review unit that was released early with unofficial packaging? I didn’t realize Apple did that?

  • satvikpendem 10 hours ago

    > 25% faster M4 chip: The box lists this 14-inch MacBook Pro as having an M4 chip with a 10-core CPU, whereas the current model has the M3 chip with an 8-core CPU. An alleged Geekbench 6 benchmark result shared by the YouTube channel behind this leak suggests that the M4 chip will be up to 25% faster than the M3 chip. Apple already introduced the M4 chip in the iPad Pro in May, and it indeed has up to a 10-core CPU and up to 25% faster performance compared to the M3 chip.

    > 16GB of RAM: It was previously rumored that 16GB of RAM could become the minimum for all future Macs, and the alleged packaging indeed suggests that the next 14-inch MacBook Pro would start with 16GB of RAM, unless it is somehow a built-to-order configuration that will be offered on Apple's online store.

    > Three Thunderbolt 4 ports: The box suggests the base 14-inch MacBook Pro will have three Thunderbolt 4 ports, up from two Thunderbolt 3 ports on the current model.

    > Space Black: The box suggests that Space Black would become a color option for the base 14-inch MacBook Pro, whereas it is currently exclusive to models configured with the M3 Pro or M3 Max chip.

    Aside, I still have an M1 Pro 16", is it worth upgrading to an M3 (or future M4)? It feels like my M1 can still do everything I need it to (coding, compiling, running emulators for mobile development), but for those that did upgrade, do you feel it was worth it, or are the gains incremental?

    • nfriedly 10 hours ago

      > It feels like my M1 can still do everything I need it to

      I think you answered your own question: No, you don't need the upgrade.

      (I'm in the same boat as you, for what it's worth. My wife and I both have M1 machines, and they both do what we need, so we're not upgrading.)

      • darnfish 10 hours ago

        Still interested to know the answer to this Q tho:

        > for those that did upgrade, do you feel it was worth it, or are the gains incremental?

        • jamie_ca 10 hours ago

          I was comparing notes with a new coworker the other week - I've got an M1 Pro, he's got an M3. Running an individual test for a feature we were working on (in Rails) lines up a bit with the 25-30% numbers quoted upthread, but that's the difference between my machine taking 40 seconds or so vs. his taking 30 or less.

          It's noticeable when you've got it to compare against, and I'm looking forward to work's hardware refresh cycle bumping me to M4 next year, but I still don't think I'd upgrade if it were a personal machine (and if I can get a decent discount to buy out the depreciated M1 for personal use I'm planning on that rather than looking at anything new).

          • Jtsummers 10 hours ago

            That's my take as well. I used an M1 and an M3 for the same work, the M3 was definitely faster but the M1 wasn't bad. Both were substantially better than the Intel Macs I was using before (for the same tasks as well). Times cut from 5-10 minutes down to 1-2 minutes on the M1, and another 10-30 seconds shaved off with the M3. So faster, but not as game changing as the M1 itself.

            The one thing I do like with the M3 vs M1 is when I had to, for reasons, run an x64 VM. I felt it was barely usable on the M1, but it was tolerable on the M3. The performance on the M3 of an x64 VM was close to the old Intel Macs I'd ditched, which were acceptable but hardly great. The M1 running a VM felt like a time warp back to my college days in the early '00s.

        • fiddlerwoaroof 9 hours ago

          I have a personal 2020 M1 MBP and a work MBP with an M2 or M3 CPU. IMO, the difference isn’t worth an upgrade yet, especially if you compare it to the upgrade from intel to Apple silicon.

        • usefulcat 9 hours ago

          I have an M1 mini and an M3 Air. I use both of them in very similar ways and don't see a noticeable difference in performance. They're both more than fast enough for everything I do.

        • wwwlouishinofun 10 hours ago

          It may be faster when switching between tasks. My problem is mainly that the screen of my M1 MBA is too bad. It would be great if I could replace the screen module separately.

          • ezst 9 hours ago

            That's why my next thing is a Framework laptop (I'm aware of how tone-deaf it might sound in an Apple thread, but I really think Apple should be a leader on this front instead of making their devices less and less upgradeable and repairable)

            • gibolt 8 hours ago

              They are finally starting to move somewhat in this direction. Maybe we end up closer to that in 3-4 more generations.

              • ezst 7 hours ago

                Let's see, one can hope, but before that they have to abandon decades of a business model based on rent-seeking and up-selling memory and storage at 10× the price. Wall Street won't like it, and only the EU can incentivize them enough.

      • LoganDark 10 hours ago

        > I think you answered your own question: No, you don't need the upgrade.

        They asked if it would be worth it, not if they needed it.

    • Farbklex 7 hours ago

      I am an android dev who works with an M1 Pro 14" on some client and private projects and I also use an M3 Pro 16" provided by one client for a specific project.

      I can't tell the performance apart. Everything is absolutely good enough on both machines.

    • adastra22 10 hours ago

      What version of the M1? I just “upgraded” from an M1 Max to an M3 device. The single-core performance is higher on paper, but I don’t really notice it. I do notice the fewer cores. If you are on a base model, it may be worth upgrading to a Pro or Max CPU.

      • satvikpendem 8 hours ago

        M1 Pro, the 16" MacBooks (and by extension, all MacBook Pros) only have Pro or Max chips, not base.

    • gyomu 7 hours ago

      OLED should be coming to MacBook Pros within a year or two. If your current laptop still works for your needs, I’d wait for that to upgrade.

    • ValentineC 10 hours ago

      I'd upgrade not for the specs, but just for MagSafe.

      • satvikpendem 8 hours ago

        My 16" M1 MacBook Pro currently already has MagSafe.

      • usefulcat 9 hours ago

        It is nice to have magsafe back. Without that I'd be paranoid that the USB sockets would wear out over time from constantly being used for charging.

        • ezst 9 hours ago

          Don't be; in my experience (and that of everyone around me), USB-C is very sturdy. My wife had a MB Pro (last Intel, with USB-C) and recently upgraded to the latest gen with the magnetic connector. It's really annoying that we can't use it on anything but her laptop; it really feels like going backwards, and as a result we don't carry it. And for what? So that it gives way when you pull it gently? That's also pretty much how USB-C behaves on all my devices... Apple just really likes their dongles and that's all there is to it, I believe.

          • dagmx 8 hours ago

            Your rationale doesn’t make sense since every single usb port on the device can also charge it.

            The reasons for MagSafe are likely

            - people on average prefer a connector they don’t need to guide in or worry about pulling their laptop down with. This is borne out in the reactions even on HN when it was reintroduced.

            - it frees up the USB-C ports for actual data use. If people are often dedicating one port for charging, it’s almost a waste to run data lines to it.

            - MagSafe came out at a time when USB-C couldn’t carry as much power as the MagSafe connector could.

            • ezst 7 hours ago

              > Your rationale doesn’t make sense since every single usb port on the device can also charge it.

              It does, because as I said, we now have a charger that basically amounts to e-waste, as a result of its supposed benefits being very weak and its downsides very real.

              • dagmx 3 minutes ago

                The only part that is single-purpose is the cable. Your charger can be used for standard USB-C.

                The supposed benefits are very real whether you appreciate them or not, as I listed in my comment, whereas the downside is that you have one extra cable that you can choose not to use.

                It feels like you’re making a mountain out of a molehill.

              • mrkstu 6 hours ago

                The charger is still USB-C: one end of the detached cable is MagSafe, the brick side is USB-C, and it can charge other USB-C devices using a C-to-C cable - I often use it to quickly charge my iPad Pro.

          • imiric 8 hours ago

            > in my experience (and that of everyone around me), USB-C is very sturdy

            That might be due to Apple's build quality, but the USB-C connector has been anything but sturdy in most machines I've used. The connection becomes wobbly after just a few months of very careful usage, and tends to cut out if the cable is pulled in a certain direction. Connecting an external disk drive is a plug-and-pray situation. I've even had the connector dislodge entirely on one machine. USB-C may have given us reversibility, but USB-A had a much sturdier connection.

            To say nothing of the uncertainty of which protocol is supported and which power profile will be negotiated by the combination of cable and devices. It's a UX mess.

            • Tade0 8 hours ago

              I can't agree on the sturdiness of USB-A. I used to buy laptops with at least four USB ports specifically so that they wouldn't all wear out before it was time to upgrade.

              Meanwhile my previous phone from 2017, which had USB-C, served for six years without issue except for the accumulation of dust and grime in the receptacle, which caused the behaviour you're describing, but once I dealt with that it was fine.

              In USB-C the leaf spring holding the plug in place was moved to the plug, so typically it's the cable that's the source of the problem.

              • imiric 6 hours ago

                That hasn't been my experience. Dust and wear are common issues with both connectors, but I'm specifically referring to the quality of the connection. USB-A, being larger and rectangular, simply achieves a better physical coupling than USB-C. Cables are a source of problems as well, but I've specifically had failures of the female ports and not of any individual cable. More so on laptops than mobile devices, to be fair, but I never had these issues with USB-A.

            • fragmede 8 hours ago

              The fact that I can have a USB-C cable that is directional, because I used a USB-A -> USB-C adapter on one end of a USB-C to USB-A cable, still weirds me out. I'm not evil enough to release cables that look normal but only work in one direction, but someone could.

          • satvikpendem 8 hours ago

            This has been my experience as well, I don't carry the MagSafe cable anymore either as my one USB C to USB C cable and charger work just fine everywhere. Rarely am I charging two or more devices at once, something that helps when the Apple Silicon Macs have such long battery life, combined with the fast charging on both my MacBook and phone.

        • wkat4242 8 hours ago

          The good thing about USB-C is that all the springy parts are in the plug now, unlike with USB-A, where the plug was non-moving and the socket had the springs. Meaning the part that wears out the most is in the cable, which is much cheaper to replace.

        • lostlogin 7 hours ago

          MagSafe can also cause problems.

          I seem to get black sand in mine, which is magnetic.

          It leads to scorch marks, which can’t be good.

          New Zealand has a lot of black sand beaches. I’ve never taken a laptop to one and only go to them a few times a year. Why all the laptops in the house end up with magnetic sand in the port is a mystery.

          That said, I love magsafe.

          • grecy 6 hours ago

            I use a wooden toothpick occasionally to get it all out, works well

    • pr337h4m 8 hours ago

      >16GB of RAM could become the minimum for all future Macs

      If true, this would be a big deal for local LLM inference.

  • x0xMaximus 10 hours ago

    Assuming this is real, I can't help but think this could push Apple to accelerate their reduction of Chinese production. Is there any other realistic way this could have leaked than through a Chinese backdoor?

    Are these being made at Quanta or Foxconn? Since the start of the war, there has been a massive increase in sanction evasion through China and Central Asia. I would have to assume Valentin used a connection that has formed over the past 2 years in order to get a North American SKU as Russia's richest YouTube blogger.

    • jnaina 6 hours ago

      Apple is now making some of the MacBooks in Vietnam. Quite a bit of the Russian diaspora in Vietnam.

  • spockz 10 hours ago

    I have one gripe with these awesome machines: the CPU core scheduling. As part of developing our application we run benchmarks. Every so often we get strange performance dips that I can only attribute to the load-generation process or the application process (or both) temporarily being scheduled on an efficiency core.

    On macOS native there would be profilers I could use to at least detect it. But when using docker that gets all smudged into a single vm process.

    Does anyone know how to detect this? Or, better, prevent it? The only way I found is making a development kernel build to switch the boot core to a performance core and then disable all the efficiency cores. This is not practical on company hardware.

    • saagarjha 7 hours ago

      If you quiesce the rest of the VM then you should be able to correlate the CPU usage of the virtualization process to your workload, right?

  • tivert 10 hours ago

    Man, the sanctions against Russia really aren't working. They're even getting US technology before it's released to Americans!

    • boffinAudio 5 hours ago

      Sanctions never work; they are a crime against humanity, because they only ever affect the lower classes of the target nation, and never its ruling class, who are responsible for the acts being sanctioned in the first place.

      It is a broken approach to democracy's biggest failure: its lack of otherwise effective diplomacy when forced to engage with non-aligned foreign nations.

      • whamlastxmas 2 hours ago

        Sanctions absolutely cost the wealthy lots of money. It maybe doesn’t impact their quality of life but does cost them a lot of money, and we know how much they like money.

        • boffinAudio an hour ago

          Sanctions cost poor people their lives. They have no effect on the wealthy other than to motivate the wealthy to route around them - which they always do.

          Sanctions just don't work, never have and never will, and should be treated as crimes against humanity.

  • submeta 10 hours ago

    In my case going from M2 to M3 with more RAM (from 16 GB to 36 GB) will give me a greater performance increase than going from M2 to M4 both with 16 GB.

  • amanzi 4 hours ago

    I'm sure the next MacBook Pro won't have a notch. This was a stupid design decision and was only done to align with the "distinctive" notch of the iPhone. Now that that is gone, I'm sure the next MacBook Pro will have the same Dynamic Island thing.

  • 11 hours ago
    [deleted]
  • FootballMuse 10 hours ago

    I guess Thunderbolt 5 will have to wait until M5.

    • imeron 9 hours ago

      I do wonder if they can continue improving interconnect specs, as with Thunderbolt 4 the cable already seems to be the limiting factor. It's thick, expensive, and only available in very short lengths.

      • WithinReason 7 hours ago

        Thunderbolt was supposed to be optical originally; maybe it's time. There are rumors of PCIe 7 being optical too.

  • slater 10 hours ago

    Reminds me a bit of those blurry photos of fake iProducts ca. 2005, taken in a blue elevator or loading dock. Anyone remember those?

  • idk1 7 hours ago

    I feel like we can say this is not real; at least, I'm leaning in that direction. Apple would rarely launch a new laptop without changing the background image: each new one gets a new wallpaper. It's a core part of making the laptop look new in their marketing and on the boxes.

    • 7 hours ago
      [deleted]
  • 11 hours ago
    [deleted]
  • monroewalker 10 hours ago

    Still using an Intel MacBook Pro. Time for an M3 or wait for M4?

    • adastra22 10 hours ago

      If the M4 really is coming out in a month, wait. Either get a 25% improvement “for free” (compared to what you’d pay now), or get a used one for cheaper as people with more money than sense rush to upgrade.

      But it is way past time to upgrade your Intel Mac to Apple Silicon. The difference is night and day.

      • bigtex 3 hours ago

        Probably have a chance at getting a nice discount on a refurb/discontinued model after the M4 comes out.

    • kylec 10 hours ago

      At this point, wait for the M4 (it seems like it's going to drop in a month or so) and then get that. Though truth be told, you could jump to an M4 MacBook Air and still get better performance than your Intel machine. Most people do not need the "Pro" or "Max" chips, even professionals.

      • saagarjha 7 hours ago

        M4 MacBook Air is rumored for next year.

      • kruxigt 7 hours ago

        [dead]

    • Aeolun 10 hours ago

      M3 is already a massive improvement, but since prices don’t really get lower as time goes on, you might as well buy the M4 when it is fresh?

    • salojoo 9 hours ago

      You should wait for M5 to get M4 for a discount

  • pier25 10 hours ago

    I was hoping Apple would get into TB5 for the M4 Macs.

    • bzzzt 6 hours ago

      I'm very interested in your use case where 40Gbit/s is not more than enough.

      • pier25 2 minutes ago

        You don't really have 40Gbit/s for data transfer in TB3/4. A good chunk is reserved for DisplayPort data.

        Even when using USB4, I still haven't seen an SSD enclosure that can use all that bandwidth.

        The use case is video editing and maybe even eGPU for Blender.

  • refurb 4 hours ago

    My question is - how does a Russian guy get his hands on an unreleased Apple product?

    Smuggled out of a factory?

    I would have guessed Apple had security locked down tight for new products. I know it's China, but usually dangling "Apple will punish you by pulling future products" is enough incentive.

  • yieldcrv 10 hours ago

    I just want to know the max RAM configuration in the laptop form factor, and the memory bandwidth. I would like more than 800GB/s of memory bandwidth.

    My M1 Max can do everything I want it to, although it maxes out at 64GB of RAM. But my use cases expand above 128GB, and that would change my approach.

    Specifically, it would allow my machine to act as a node on more distributed networks, maybe even Solana, and to run larger language models at all, and faster.

    • raxxorraxor 2 hours ago

      While the unified memory of Macs is nice, they still have performance issues compared to a PC GPU. Of course, the pricing would be entirely different with >128GB there. Seems like Apple is the budget solution now.

      I use an adapted LLM on an old M1 MacBook Air and it runs very well, even for autocompleting code. Image synthesis is quite slow though, even on newer Apple chips.

      Hope Nvidia's moat dies soon.

  • 4 hours ago
    [deleted]
  • behnamoh 10 hours ago

    I want to know its bandwidth. How does it compare to an Nvidia 4090? If it exceeds that (at least in the Max/Ultra model), it'll be huge; we'll suddenly have a solid 190+GB of VRAM (technically URAM) to run multiple LLMs on.

    • andersa 10 hours ago

      That comparison doesn't make sense. If you want 192GB VRAM, then that is the size of 8 4090s, so to make it competitive, the bandwidth also needs to be 8 times higher. Or 8TB/s in this case.

      • klohto 4 hours ago

        What are you talking about? Why would you multiply the bandwidth? 8 4090s is still 1000GB/s, while the M2 Ultra is 800GB/s with a top of 192GB of VRAM. Metal can access ~155GB, so you need a bit more, but your comparison makes absolutely zero sense.

        • andersa 38 minutes ago

          When you have 8 GPUs, you can use more than 1 at a time.

        • boroboro4 2 hours ago

          There are different ways to run LLMs on multiple GPUs; one of them (called tensor parallelism) effectively multiplies bandwidth across the different GPUs in low-batch scenarios. So no, 8 4090s is not 1000 GB/s.
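          With tensor parallelism, each GPU holds a 1/N shard of the weights and streams it concurrently, so for bandwidth-bound, low-batch decoding the effective bandwidth scales with N. A toy model with illustrative numbers (assuming even sharding, negligible interconnect cost, and 1008 GB/s as the 4090's nominal bandwidth):

```python
# Toy model: bandwidth-bound decoding reads all weights once per token,
# so tokens/s ~ aggregate effective bandwidth / model size. With tensor
# parallelism each GPU only streams its own 1/N shard of the weights.
def tokens_per_second(model_gb, per_gpu_bw_gb_s, n_gpus=1):
    shard_gb = model_gb / n_gpus            # weights resident on each GPU
    return per_gpu_bw_gb_s / shard_gb       # GPUs finish their shards in parallel

model_gb = 140                              # e.g. a 70B-parameter model at fp16
print(tokens_per_second(model_gb, 800))     # single 800 GB/s device: ~5.7 tok/s
print(tokens_per_second(model_gb, 1008, 8)) # eight 4090s, sharded: ~57.6 tok/s
```

          In practice, the per-layer all-reduce between GPUs eats into this, which is where the PCIe-bandwidth objection comes in.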

          • klohto 2 hours ago

            let me know how the PCIe bandwidth is treating you

      • behnamoh 9 hours ago

        This is so wrong I don't even need to correct it.

        • saagarjha 6 hours ago

          You don't need to post like that, either. And yet here we are.

  • znpy 9 hours ago

    “Space black” is considered an upgrade nowadays?