Apple Thread - The most overrated technology brand?


What killed Steve Jobs?

  • Pancreatic Cancer

    Votes: 65 12.2%
  • AIDS from having gay sex with Tim Cook

    Votes: 468 87.8%

Total voters: 533
Just a reminder that this happened.

https://youtube.com/watch?v=_GxC4kKD9qA

Jfc those red eyes, she really was stoned off her gourd

Oh, and also, Apple officially announced the long-rumored "Studio AirPods" under the name AirPods Max. Here's what your $550 will get you.

[Attached image: AirPods Max]

…or at least will get you in March or so, seeing as how they're already backordered that far.

They look like a parody of exactly what a pair of headphones by Apple would look like

I dunno, they'll probably be good but not worth $550. Target had Beats Studio 3 for $175, I guess because this was right around the corner
 
Yeah, the AirPods Max are overpriced. They look nice, but yeah, I wouldn't touch them unless they had a really good sale.

Other news:

Bloomberg's released a report that says that Apple's working on even more powerful chips.
The current M1 chip inherits a mobile-centric design built around four high-performance processing cores to accelerate tasks like video editing and four power-saving cores that can handle less intensive jobs like web browsing. For its next generation chip targeting MacBook Pro and iMac models, Apple is working on designs with as many as 16 power cores and four efficiency cores, the people said.
For higher-end desktop computers, planned for later in 2021 and a new half-sized Mac Pro planned to launch by 2022, Apple is testing a chip design with as many as 32 high-performance cores.
You know those 4 power cores that are stomping basically anything not named Zen3, Intel Gen 11 and Intel Xeon? Let's add 12 to 28 more of those bastards.
Bloomberg's also reporting that Apple's working on 16- to 32-core GPUs, with the high end being 128 cores for the Mac Pro.
 
God, I really hope the "half-sized Mac Pro" thing isn't true. I really thought they had learned their lesson from the Trash Can, and the Cube before it - remember the Cube? Both pretty computers but not what their supposed target markets wanted.

And more cores is nice, but I'm hoping they've solved the apparent technical limitations (if that's what they were) that prevent external GPUs and more than two Thunderbolt ports.
 
Yeah, the M1's amazing for the 'lower' end, but the lack of RAM (although the Mac community's working on the assumption that the onboard RAM's about 40%-50% more efficient), and the two Thunderbolt ports really hurt (for the Mac Mini, anyway. There's a reason why the MacBook Pro that got replaced is called the 2 port). I'd say a bigger, meaner, version would have more ports and RAM. Might not be as efficient or compact though.

The external GPU thing is interesting, because the M1 Macs recognise them, just that they don't try to actually run them. Might be a driver thing, might be an Apple thing. Might be both.
 
Yeah, the M1's amazing for the 'lower' end, but the lack of RAM (although the Mac community's working on the assumption that the onboard RAM's about 40%-50% more efficient), and the two Thunderbolt ports really hurt (for the Mac Mini, anyway. There's a reason why the MacBook Pro that got replaced is called the 2 port). I'd say a bigger, meaner, version would have more ports and RAM. Might not be as efficient or compact though.
I thought you could daisychain Thunderbolt anyway, at least if the intermediate devices all were capable of it.
 
I thought you could daisychain Thunderbolt anyway, at least if the intermediate devices all were capable of it.
Not quite "daisy-chain", but it's possible for devices to work as hubs. They have to have more hardware built into them to support that, though, so few do. It's the same situation as USB.
 
You know those 4 power cores that are stomping basically anything not named Zen3, Intel Gen 11 and Intel Xeon? Let's add 12 to 28 more of those bastards.
Those processors will be going up against AMD and Intel parts with equivalent core counts, and a 32-core Apple chip will have a Threadripper/Xeon price tag + Apple tax. The chip is tiny right now at 8+8, supposedly 16 billion transistors in total; if it scales linearly, then 32 CPU cores and 128 GPU cores would be an absolutely massive chip (rough math at the end of this post).

I saw some benchmarks testing the M1 single core score against Ryzen/Intel running one core/two threads and it changed things a bit.
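
Quick back-of-the-envelope on that, assuming the M1 figures above (~16 billion transistors, ~119 mm² die, 4+4 CPU cores, 8 GPU cores) and purely linear scaling. The even CPU/GPU area split is my own guess rather than anything from the report, and real chips spend a lot of die on cache, memory controllers and I/O that don't scale per-core, so treat it as a doodle, not a projection:

```python
# Rough die-size doodle: scale the M1 linearly up to the rumoured core counts.
# Assumptions (not from the Bloomberg report): the M1's ~119 mm^2 splits
# roughly 50/50 between CPU-ish and GPU-ish area, and each half scales
# linearly with core count on the same process node.

m1_die_mm2 = 119
m1_cpu_cores = 8        # 4 performance + 4 efficiency
m1_gpu_cores = 8

big_cpu_cores = 36      # 32 performance + 4 efficiency, per the report
big_gpu_cores = 128

cpu_area = (m1_die_mm2 / 2) * (big_cpu_cores / m1_cpu_cores)
gpu_area = (m1_die_mm2 / 2) * (big_gpu_cores / m1_gpu_cores)

print(f"CPU blocks: ~{cpu_area:.0f} mm^2")              # ~268 mm^2
print(f"GPU blocks: ~{gpu_area:.0f} mm^2")              # ~952 mm^2
print(f"Total:      ~{cpu_area + gpu_area:.0f} mm^2")   # ~1220 mm^2
```

Even with those generous assumptions it blows past the ~850 mm² reticle limit, so "absolutely massive chip" is putting it mildly; in practice you'd expect a denser node, multiple dies, or smaller configurations.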
 
Yeah, most likely the 32+4 CPU cores and 128 GPU cores would be part of the high-end Mac Pro option, probably the memetic $55k Mac Pro territory, which is currently a 28-core 2.5 GHz Intel Xeon W, with 1.5 TB of RAM (12x128 GB), two Radeon Pro Vega II Duos (2x32 GB), 8 TB of storage and Apple Afterburner (which Apple claims can handle six streams of 8K ProRes RAW footage simultaneously at 29.97 fps).
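
For scale on that Afterburner claim, here's the raw pixel throughput it implies, assuming 8K UHD frames (7680x4320). ProRes RAW data rates depend on compression, so this only counts decoded pixels, not bytes on disk:

```python
# Pixel throughput implied by "six streams of 8K ProRes RAW at 29.97 fps".
# Assumes 8K UHD (7680 x 4320) frames; actual ProRes RAW bit rates vary.

width, height = 7680, 4320
fps = 29.97
streams = 6

pixels_per_second = width * height * fps * streams
print(f"~{pixels_per_second / 1e9:.1f} billion pixels per second")  # ~6.0
```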
 
Yeah, most likely the 32+4 CPU cores and 128 GPU cores would be part of the high-end Mac Pro option, probably the memetic $55k Mac Pro territory, which is currently a 28-core 2.5 GHz Intel Xeon W, with 1.5 TB of RAM (12x128 GB), two Radeon Pro Vega II Duos (2x32 GB), 8 TB of storage and Apple Afterburner (which Apple claims can handle six streams of 8K ProRes RAW footage simultaneously at 29.97 fps).
Not having a discrete GPU is what makes me actually excited for the future; I want to see what they come up with, because that iGPU will be a growing boil on their ass. What they have now is fantastic for a 119 mm² SoC, but making the GPU 16 times larger (current design, current node, current info)? Oooh shit. More CPU cores will be easy in comparison.

I look forward to future Apple things because so far they have defied my dumb-ass expectations of nunchucking themselves into the balls and slumping over so if they say they can pack 128 GPU cores into a speedo without breaking a sweat and do parkour tricks, hell yeah!
 
Ah, one thing I'll quickly mention.

List of games being tried on the M1 with a variety of methods, clients and framerates, based on this spreadsheet here.

Interesting titles that caught me off guard were Crysis via Crossover, running at 25-45 fps at 2K res with all options maxed, Dark Souls Remastered, X-Plane 11, and an appearance of Metro Exodus, which seems to run at 20-40 fps on low settings.

Which, given that the Mac and Linux versions of Metro Exodus were announced in early December, should see a performance boost.
 
Ehhhh, I've only got Mk I eyeballs, 30 fps is fine.
It's fine for computer hardware that costs under $100. I don't know how the fuck 60fps across the board isn't an absolute minimum in a day and age where 1080p is too low for cellphones.

And from a company that's so eager to push technology, they put out laptops without USB-A ports, before there were even very many USB-C devices out. That's what irritates me so much about Apple, their cutthroat approach to culling old technology that still works well, like headphone jacks and USB-A, while keeping around old-ass shit like 30fps video and non-touch monitors. If they approached framerates like they approach everything else, 120fps content would already be on the chopping block in favor of 240.
 
It's fine for computer hardware that costs under $100. I don't know how the fuck 60fps across the board isn't an absolute minimum in a day and age where 1080p is too low for cellphones.

And from a company that's so eager to push technology, they put out laptops without USB-A ports, before there were even very many USB-C devices out. Shit, they should be ready to cull 60 in favor of 120 for everything by now.
I'm sure their internal displays are capable of 60fps at the least. You were complaining above about the capability of their video encoding hardware, which isn't nearly the same thing.

As far as I know - and I might be wrong - TV is still shot at 29.97fps, and most movies still at 24fps because the "smoother" video from using higher framerates makes them look like cheap TV productions… so 29.97fps is still a reasonable rate to aim for. But I wouldn't doubt it could do more if you didn't need to encode six streams of them at a time.

I do agree about USB-A, but "we want people to stop using these things so we're going to stop including them prematurely" is an Apple tradition from all the way back when the first iMacs didn't have a floppy drive. Comes with the territory. (It sounds funny now but it was quite a controversial move at the time and many people were guessing it'd be a failure for that reason alone.)
 
I do agree about USB-A, but "we want people to stop using these things so we're going to stop including them prematurely" is an Apple tradition from all the way back when the first iMacs didn't have a floppy drive. Comes with the territory. (It sounds funny now but it was quite a controversial move at the time and many people were guessing it'd be a failure for that reason alone.)
This came directly from Jobs, who at least had a keen eye for technology that actually was moribund. It was not long before floppies were entirely dead and everyone else stopped including them. Since then, Apple has tried to imitate the Jobsian method but really has no fucking clue what isn't useful or is about to die. So it's just annoying.
 
We should mention that the OG iMac also dropped the PS/2 ports, aka these beloved things.
[Image: PS/2 keyboard and mouse ports]

That was in favour of USB keyboards and mice. Granted, Apple's hockey puck mouse was terrible, but let's be honest, at the time everyone used the Intellimouse (which had a scroll wheel, and side buttons!). The USB keyboard was nice though, plus it had two built-in USB ports, one of them for your mouse.
 
We should mention that the OG iMac also dropped the PS/2 ports, aka these beloved things.
[Image: PS/2 keyboard and mouse ports]
Apple didn't use PS/2. Their bus for keyboards and mice looked similar connector-wise, but was completely incompatible and was called ADB. But yes, the iMac did drop those, along with the custom serial ports for modems and printers and SCSI, in favour of USB.
That was in favour of USB keyboards and mice. Granted, Apple's hockey puck mouse was terrible, but let's be honest, at the time everyone used the Intellimouse (which had a scroll wheel, and side buttons!). The USB keyboard was nice though, plus it had two built-in USB ports, one of them for your mouse.

I'm not a hater of the Hockey Puck. I kinda liked it personally. Really flat and easy to drive with just your fingers rather than your whole hand/arm. But generally, yes, the move to USB was great in that it eventually led to the ability to use any damn mouse (or keyboard) we wanted rather than being limited to those released with Mac-specific connectors.
 
I'm sure their internal displays are capable of 60fps at the least. You were complaining above about the capability of their video encoding hardware, which isn't nearly the same thing.

As far as I know - and I might be wrong - TV is still shot at 29.97fps, and most movies still at 24fps because the "smoother" video from using higher framerates makes them look like cheap TV productions… so 29.97fps is still a reasonable rate to aim for. But I wouldn't doubt it could do more if you didn't need to encode six streams of them at a time.

I do agree about USB-A, but "we want people to stop using these things so we're going to stop including them prematurely" is an Apple tradition from all the way back when the first iMacs didn't have a floppy drive. Comes with the territory. (It sounds funny now but it was quite a controversial move at the time and many people were guessing it'd be a failure for that reason alone.)
From what I've been told, part of the reason a low frame rate like 24 fps stuck around for film (the analogue medium) was that lighting and photography were built around roughly 1/24th of a second per frame; people were trained for that. Higher frame rates = shorter exposure = more light needed when using the same film stock. You can change the film stock to something more sensitive to light to compensate, but that film will have different properties when it comes to color reproduction and grain, so it will look different, which in turn can be compensated for by rejiggering the lights and tweaking the post-process and... I mean, I wouldn't notice, but the leading filmmakers who dictate what is right and wrong have traditionally been drug-fueled auteurs/autists and they will have strong opinions about the tiniest thing. I've never filmed anything so I don't know anything about it, but that's the gist of what's been explained to me and it sort of makes sense.
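
To put rough numbers on that, using the same simplification as above (exposure ≈ the full 1/fps frame interval; real cameras only expose for a fraction of each frame, set by the shutter angle, but the relative light loss works out the same):

```python
# Exposure time vs frame rate, simplified to exposure = 1/fps per frame.
# The absolute times are optimistic (shutter angle shortens them), but the
# ratio between frame rates, i.e. the stops of light lost, is unchanged.
import math

BASE_FPS = 24

def exposure(fps):
    return 1 / fps  # seconds of light per frame, simplified

for fps in (24, 30, 48, 60):
    stops_lost = math.log2(exposure(BASE_FPS) / exposure(fps))
    print(f"{fps:>2} fps -> 1/{fps} s exposure, "
          f"{stops_lost:.1f} stops less light than 24 fps")
```

So doubling the frame rate costs a full stop of light, which is exactly the "more light or a faster (different-looking) film stock" trade-off described above.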
 