Here is the scene in browser. Runs plenty smooth on my 5 year old computer. https://splatter.app/s/lzs-xtl
Ya, Jakub really did an amazing job with Splatter. Martin and Will at SuperSplat also have something similar, and it works great. The only difference with us is that ours is lightweight, native, supports XR (Vision Pro), and the LOD structure is computed at load time, so you can load any splat file within a couple of seconds.
Is this the WOW effect of the hardware price-to-performance ratio? The only significant benefit I can see is that the M-series chips use system RAM as GPU memory, which is slower than dedicated VRAM, but at least you can run things with all that memory.
Why can't they make video games with this tech?
Has anyone sorted out a good way to do dynamic lighting or animation with it?
The real world makes for shitty map design.
From what I have heard, "Bodycam" uses scans of actual locations for its maps.
I think they are coming; we should see a few pop up in 2026 for sure.
It will be a nightmare to license the use of all this data for commercial purposes. Each house and each building requires consent.
Sorry, but the answer is no. Unless you are willing to pay.
Prove it. Under what law?
That's why Hollywood movies are so expensive. When they have a scene with Spider-Man jumping around in New York, they have to pay a fee to every owner of real estate depicted in the scene.
Worst of all are, of course, space documentaries, where you can see the whole Earth. The licensing fees are horrendous.
Wait... so what about Google Maps?
He’s kidding
Is there some sort of level-of-detail going on to make this possible? I'd think that's the only way, but the tweet says no preprocessing.
Yup, we compute the LOD structure on load, within a couple of seconds, using GPU compute shaders.
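To make the "LOD structure computed on load" idea concrete, here is a minimal CPU-side sketch in Python/NumPy of one plausible approach: bin splats into progressively coarser voxel grids and collapse every occupied voxel into a single averaged proxy splat. This is an illustrative assumption only, not the actual GPU compute-shader implementation described above; the function names and parameters are hypothetical.

    # Sketch of one way a splat LOD hierarchy could be built at load time.
    # Illustrative assumption, not the implementation described above (which
    # runs in GPU compute shaders): splats are binned into progressively
    # coarser voxel grids, and each occupied voxel becomes one proxy splat.
    import numpy as np

    def build_lod_levels(positions, colors, scales, base_cell=0.05, num_levels=4):
        """Return [(positions, colors, scales), ...] from finest to coarsest."""
        levels = [(positions, colors, scales)]
        for level in range(1, num_levels):
            cell = base_cell * (2 ** level)              # coarser grid each level
            keys = np.floor(positions / cell).astype(np.int64)
            # Group splats that land in the same voxel.
            _, inverse = np.unique(keys, axis=0, return_inverse=True)
            inverse = inverse.ravel()
            n_cells = int(inverse.max()) + 1
            counts = np.bincount(inverse, minlength=n_cells)[:, None]
            pos = np.zeros((n_cells, 3))
            col = np.zeros((n_cells, 3))
            sc = np.zeros((n_cells, 1))
            np.add.at(pos, inverse, positions)           # accumulate per voxel
            np.add.at(col, inverse, colors)
            np.add.at(sc, inverse, scales)
            levels.append((pos / counts, col / counts, sc / counts))  # per-voxel means
        return levels

    # Hypothetical usage, with random splats standing in for a loaded splat file.
    rng = np.random.default_rng(0)
    P = rng.uniform(-1.0, 1.0, (100_000, 3))   # positions
    C = rng.uniform(0.0, 1.0, (100_000, 3))    # RGB colors
    S = rng.uniform(0.001, 0.01, (100_000, 1)) # isotropic scales
    for i, (p, _, _) in enumerate(build_lod_levels(P, C, S)):
        print(f"level {i}: {len(p)} splats")

At render time one would pick a level per region based on projected size; the real system presumably does the equivalent binning/reduction in compute shaders, which is how it can finish within seconds even for large files.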
What’s the data?
Runs fine on my iPhone 14
The MacBooks have insane performance and everything else is falling behind.
It won’t be surprising, IMO, if Apple overtakes Windows as a gaming platform in the coming decades, assuming Intel can’t catch up.
Apple may have good hardware, but their software support is comically bad. Their attitude toward backwards compatibility (aside from Rosetta), which you need for gaming, has basically been "FU, I'm Apple".
Apple has had little to no interest in becoming a real gaming platform. Unless that changes, gamers will more likely be moving into the sweet embrace of Gaben on Linux.
Open the App Store on an iPhone. Of the four tabs, two are game-centric (“Games” and “Arcade”). Another (“Today”) has consistently devoted more than half of its features to games.
In their most recent operating systems, they have released a separate app specifically for games (look at that domain, even).
https://games.apple.com
They created the Game Porting Toolkit.
https://developer.apple.com/games/game-porting-toolkit/
When they discontinue Rosetta next year, they’ll continue limited support specifically for old games.
https://developer.apple.com/documentation/apple-silicon/abou...
Plus, whenever they announce new chips they feature games and gaming personalities in the keynote.
Those are clear signs they are interested in gaming happening on their platforms. Whether they’re succeeding at it is another story.
And how about all the games no longer on the App Store?
Say, Flight Control (one of the first games to hit a million unit sales), or the Infinity Blade series (which Wikipedia says was removed due to incompatibility with newer Apple platform changes)?
Both of those examples are old and precede current efforts. Plus, for the third time:
> Whether they’re succeeding at it is another story.
I’m not arguing Apple excels or is even decent at video games; I’m simply pointing out that it’s clear they are interested in having them on their platforms.
Of course, they also took the route of inventing a new 3D API, Metal, which is at odds with Vulkan. There is HoneyKrisp, of course, but if one wants decent gaming on an M1 or M2 laptop, Asahi Linux is actually the superior choice.
I don't think one can call it even close to a success when the best way to run AAA games on your hardware is to literally replace the entire operating system with one built from cobbled-together components like FEX and Wine/Proton, etc. The fact that that works with more games is insane.
Again:
> Whether they’re succeeding at it is another story.
You may disagree with their strategy all you like. You may even think they are doing everything wrong; that’s perfectly legitimate. But they are clearly interested in having gaming happen on their platforms. The claim that they aren’t is the only thing I disputed.
The Apple hardware is indeed very nice, but it's not a good environment for gaming. They've traditionally been quite gaming-hostile, refusing to support later versions of OpenGL. There was also a wrapper for Windows games called Whisky, but it was finicky and became unmaintained. Apple has their own App Store, which sells some games and is in direct competition with Steam and others, so those actors are probably a bit wary of spending too many resources on the platform. Also, a lot of gamer culture revolves around building your own hardware, which Apple will never support.
Meanwhile, gaming on Linux is becoming better than on Windows these days, especially with all the trash that has to be circumvented on Win11 and with Valve working hard on SteamOS, etc.
I was under the impression that the only reason the MacBook was mentioned is that it normally wouldn't be able to render this so well.
Seems like a post about the software on display, i.e., "look, it can even run on an M2 MacBook Air."
They'll never take the gaming segment as they would need proper backward compatibility.
No game developer wants to update their game continuously just to keep the lights on.
The opposite is happening at the moment: they have fallen below Linux as a gaming platform.
I've been a huge fan of Apple's hardware since they introduced their own silicon, but this is just silly. Apple doesn't have the personality needed to court and work with game companies. They're busy expecting everyone to come to them when they'd have to actually work to entice them.
I hate to break it to ya, but Apple Silicon isn't in the top 25 highest-performing consumer GPUs. It's probably not even in the top 25 most-efficient either: https://browser.geekbench.com/opencl-benchmarks
That chart shows that M4 achieves 25% of the Geekbench scores of GPUs that pull >10x more power. That's definitely efficient.
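As a back-of-the-envelope illustration of that ratio (the numbers below are hypothetical placeholders chosen only to mirror the rough "25% of the score at less than a tenth of the power" claim above, not measured values from the Geekbench chart):

    # Hypothetical perf-per-watt comparison; the figures are placeholders that
    # mirror the ratios stated above, not measurements from the Geekbench chart.
    def perf_per_watt(score: float, watts: float) -> float:
        return score / watts

    m4_score, m4_watts = 25_000, 20          # assumed M4 GPU score and power draw
    dgpu_score, dgpu_watts = 100_000, 300    # assumed discrete GPU score and power draw

    m4 = perf_per_watt(m4_score, m4_watts)
    dgpu = perf_per_watt(dgpu_score, dgpu_watts)
    print(f"M4: {m4:.0f} pts/W, dGPU: {dgpu:.0f} pts/W -> {m4 / dgpu:.1f}x")
    # With these placeholder figures the M4 comes out ~3.8x ahead in pts/W,
    # even though its absolute score is only a quarter of the dGPU's.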
Are you comparing it with other 3nm GPUs? When you normalize for process, Apple Silicon is definitely not the most efficient raster architecture.
It doesn't seem like Nvidia even has any 3nm GPUs on the market. But sure. When you control for power efficiency, it turns out there's no difference at all!
Process is not equivalent to power efficiency. It's a step-change enabling better designs.
Apple and Nvidia both have 5nm and 4nm GPUs. Take those scores and divide them by the TDP, and you'll be shocked at the difference design can make.
Please never divide anything by TDP. Use actual power measurements, unless you're trying to ensure your numbers end up being bullshit. (In particular, any number someone claims is a TDP for an Apple processor is made up, because Apple doesn't publish or specify any quantity remotely similar to TDP.)
Okay, then don't divide by TDP. Measure the GPU wattage frame-by-frame and you'll still end up with similar numbers. The point stands.
> because Apple doesn't publish or specify any quantity remotely similar to TDP
1) That doesn't mean that power usage isn't measurable.
2) They actually do, although it's not a perfect breakdown chip-by-chip: https://support.apple.com/en-us/102027
Neither is their target; they are more in the perf/watt segment.
Which is why it’s confusing that the M3 Ultra is less efficient than several 130 W laptop chips.