Apple rumored to stop using Intel chips by 2020

glasspusher

Member
SoSH Member
Jul 20, 2005
9,973
Oakland California

NortheasternPJ

Member
SoSH Member
Nov 16, 2004
20,079
I know I'm not in the majority, but I really don't care anymore. I was so thankful years ago for Intel and x86 support and Boot Camp and everything else. In 2018, I don't really need Windows at all anymore at work. Office 2016 on Mac is actually quite good. Other things, like VMware vCenter, I can use on my Mac with the web GUI. Really the only thing I ever use my Windows image for is Visio, which I can get rid of completely with LucidChart.

I used to be in my Windows VM constantly, but I really don't need it at all anymore. Intel has been a thorn in Apple's side on the Macs for a while now.
 

jimv

Member
SoSH Member
Feb 5, 2011
1,122
Intel's execution in processors has been abysmal over the past few years. The 14nm rollout was delayed and piecemeal. The 10nm rollout is so far behind that they had to produce additional 14nm designs to have some new product to market/sell. I'm not sure they have a roadmap beyond that (iow, 7nm, etc.). Meanwhile, Samsung and TSMC have maintained their fabrication cadence.

Given that performance, Apple should, very publicly, consider alternatives. But I have questions:
  • Can ARM processors, with Apple's tweaks, deliver the requisite user experience? It worked for the iPhone; will it work for iMacs?
  • The switchover to iOS is interesting; can they pull off a new operating system and new hardware at the same time?
  • Who will fabricate the (iirc) 15-20 million chips?
 

Blacken

Robespierre in a Cape
SoSH Member
Jul 24, 2007
12,152
Most of the markets where an ARM Mac makes sense are markets where a less shitty iPad makes just about as much sense. It's very likely that we'll see downmarket MacBooks or iPad/MacBook convergence where battery life is a priority. But x86 is used because x86 has performance that, outside of fairly contrived benchmarks, not even Apple's ARM stuff is able to match. I am not going to be encoding video on an ARM Mac in 2020.

But it might be a Threadripper, depending on where they go with the new Mac Pro.
 

NortheasternPJ

Member
SoSH Member
Nov 16, 2004
20,079
I wouldn't be surprised by a move to ARM at the lower end and AMD at the higher end, with the higher-end machines having both an ARM chip and an AMD one.

I'd like to see how much more power Apple can get out of an ARM chip in something with a fan and more space for cooling, though.
 

wade boggs chicken dinner

Member
SoSH Member
Mar 26, 2005
32,663
I know I'm not in the majority, but I really don't care anymore. I was so thankful years ago for Intel and x86 support and Boot Camp and everything else. In 2018, I don't really need Windows at all anymore at work. Office 2016 on Mac is actually quite good. Other things, like VMware vCenter, I can use on my Mac with the web GUI. Really the only thing I ever use my Windows image for is Visio, which I can get rid of completely with LucidChart.

I used to be in my Windows VM constantly, but I really don't need it at all anymore. Intel has been a thorn in Apple's side on the Macs for a while now.
I'm not a tech guy, but I'm confused. Isn't the only reason you can do all of this on macOS that your Mac is running an Intel chip?

I'm interested in this thread because I may be the only person in this forum trying to keep up on MS's attempt to combine Windows 10 with Snapdragon processors. The first batch of machines is running the Snapdragon 835, which is enough to power Windows 10 Mobile but not enough to power any 64-bit x86 apps, among other limitations (here's an article discussing these limitations: http://www.zdnet.com/article/windows-10-on-arm-it-will-be-more-limited-and-heres-how-reveals-Microsoft/). Bottom line is that most reviews say to skip the Snapdragon 835 machines and see how much better the Snapdragon 845 machines will be.
 

shaggydog2000

Member
SoSH Member
Apr 5, 2007
12,311
It seems like another step, along with starting to make their own displays, to control more of their supply chain and make more things in-house. Eventually they would aim to buy all the commodity parts on the open market and make all the differentiating, value-add parts themselves.
 

nighthob

Member
SoSH Member
Jul 15, 2005
12,902
I wouldn't be surprised by a move to ARM at the lower end and AMD at the higher end, with the higher-end machines having both an ARM chip and an AMD one.

I'd like to see how much more power Apple can get out of an ARM chip in something with a fan and more space for cooling, though.
Most of the markets where an ARM Mac makes sense are markets where a less shitty iPad makes just about as much sense. It's very likely that we'll see downmarket MacBooks or iPad/MacBook convergence where battery life is a priority. But x86 is used because x86 has performance that, outside of fairly contrived benchmarks, not even Apple's ARM stuff is able to match. I am not going to be encoding video on an ARM Mac in 2020.

But it might be a Threadripper, depending on where they go with the new Mac Pro.
Yeah, I can see them moving their MacBook Air and iMac lines to ARM; the recent intro of the iMac Pro line seems to point in that direction. Put cheaper ARM-based chips in the consumer-end PCs and more powerful processors in the niche prosumer market, helping you catch more consumer sales by lowering hardware prices.
 

Blacken

Robespierre in a Cape
SoSH Member
Jul 24, 2007
12,152
I wouldn't be surprised by a move to ARM at the lower end and AMD at the higher end, with the higher-end machines having both an ARM chip and an AMD one.

I'd like to see how much more power Apple can get out of an ARM chip in something with a fan and more space for cooling, though.
They can probably do pretty well. Even very well. They can address ARM's biggest performance problems (limited execution resources, relatively short out-of-order execution (OOE) depth). These are linear problems to solve, and the transistor budget goes way up when you can use a chip that doesn't have to fit in an iPhone. However, this necessarily implies splitting the silicon team and tasking large parts of it with building what is almost unexplored territory in ARM.

We aren't talking modular components here. We're talking about significantly changing the overall guts of the chip. Apple's hardware design teams are, obviously, very good. But that's a difficult step to make. They design the A-series and S-series chips in-house, but, despite the "it's all custom" claims out there, the way they behave suggests that they're relatively incremental work on standard ARM designs (the A11 is a pretty standard big.LITTLE design with some Apple secret sauce; the S1 is an ARMv7 chip notable mostly for its downscaling). Even small permutations require a lot of work to do and to get right, as we've seen from the size of Apple's team. For PC-scale changes, you are looking at reworks of the chip so drastic that you almost might as well not use an ARM chip. I'm sure Apple is planning to try, even if they haven't so far. But they aren't guaranteed success--the fuckup potential is high, because all that software written for x64, even if it's ported fat-binary style to OS X on ARM, has implicit expectations around performance and behavior that may not hold if they see a significant perf regression.
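
(For anyone who hasn't dealt with fat binaries: the same source gets compiled once per architecture and the slices get packed into one executable, with arch-specific paths selected by compiler-defined macros. A minimal C sketch--the arm64 desktop target is hypothetical as of this writing, but the mechanism is exactly what Apple used for the PPC-to-Intel transition:)

/* demo.c -- one source, two slices of a fat binary.
 * On Apple's toolchain, something like:
 *     clang -arch x86_64 -arch arm64 -o demo demo.c
 * produces a single executable containing both; lipo(1) can also
 * combine separately built slices. */
#include <stdio.h>

int main(void) {
#if defined(__x86_64__)
    puts("running the x86_64 slice");
#elif defined(__aarch64__) || defined(__arm64__)
    puts("running the arm64 slice");
#else
    puts("running on some other architecture");
#endif
    return 0;
}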

Apple will be competing with Sapphire Rapids at that point, with Intel probably being in 7nm territory and having all that accumulated experience from now 'til then. I wouldn't let Intel's complacency over the last few years cast much doubt on the fact that Intel is the best in the world at what they do. (I am excited to see what they do now that AMD has figured out which is their ass and which is their elbow.)
 

cgori

Member
SoSH Member
Oct 2, 2004
4,249
SF, CA
I think you guys are misreading this a little bit. It's a business decision primarily, and a technical one secondarily, but I love the technical part so mostly I'll focus on that.

Apple bought PA Semi (Dan Dobberpuhl's company) many years ago - this is the team that is implementing their ARM cores. Apple has an ARM architectural license, dating from the earliest days (I believe going back to Newton), so they have free rein to do a lot of things. They are trying to balance the need for x86 ABI compatibility against the cost of being attached to Intel for a key component - this is the basic business decision. Apple has staffed that ex-PA team to the high heavens - I believe they had 150-200 people at the beginning and have heard rumors of 1000+ now. So they can probably get quite a lot done, but they are still small compared to the teams at Intel doing this stuff.

Intel (with x86) is biased micro-architecturally to higher performance, and ARM to lower power, but there is nothing at the ISA level that forces that; it's just what they make. In fact, in energy (not power) terms, you can see (fig 10) that energy for x86 is not way out of line with ARM's A15 - it depends on workload, but in some cases x86 is more efficient. You can also see in that link, at fig 11, that the power-MIPS (BIPS) tradeoffs are really just a line that different micro-architectures sit on at different points. Put another way, I think you can make a big, high-power ARM core, but because the ARM licensees are micro-architecturally biased toward low energy, no one does it yet - Apple might be able to, but there's a lot to figure out here. Since that link is an ARM-centric blog, I need to think a little more about where they (ARM) might have inserted bias, but as far as I know it is derived from a pretty fundamental paper from Wisconsin published ~5 years ago. It would also be nice to see those results on the A57 or newer, since the A15 is rather long in the tooth, but I bet it holds up.
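
To make the energy-vs-power distinction concrete with totally made-up numbers (E = P x t; the point is that a faster, hungrier core can still win on joules per task if it finishes and goes idle):

/* Hypothetical operating points, purely illustrative: the "big" core
 * draws 8x the watts but finishes 10x sooner, so it costs fewer joules
 * per task ("race to idle"). */
#include <stdio.h>

int main(void) {
    double big_watts = 4.0, big_seconds = 1.0;        /* fast, power-hungry */
    double little_watts = 0.5, little_seconds = 10.0; /* slow, frugal */

    printf("big core:    %.1f J per task\n", big_watts * big_seconds);       /* 4.0 J */
    printf("little core: %.1f J per task\n", little_watts * little_seconds); /* 5.0 J */
    return 0;
}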

I liked the comments from @jimv - but what I have read is that Intel's 10nm might be competitive with GF (i.e. Samsung's) 7nm in terms of actual achieved density - and maybe better on some process performance metrics. So even if INTC has bungled some things, they are pretty damn good at this process stuff, maybe good enough. I need to dig more on TSMC's equivalent in that node.

Last comment, because I have to as a security dude - I think that, barring some major miracle, the out-of-order execution + superscalar architectures that have dominated the last 20-25 years are well and truly fucked in the face of Spectre - the Meltdown stuff can basically be patched, but I have serious doubts about Spectre. The memory hierarchy we are used to might have to change substantially (can't have shared L2/L3 caches anymore), or maybe we have to migrate to massive-core-count superscalar but non-speculative ultra-high-clock-rate devices, which poses its own issues in terms of power/energy, or even feasibility.
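
For the curious, the whole Spectre v1 problem fits in a handful of lines. This is the shape of the bounds-check-bypass gadget, paraphrased from the public Kocher et al. paper - a sketch of the vulnerable pattern, not a working exploit:

/* Train the branch predictor so the check below is predicted taken, then
 * call with an out-of-range x: both loads can execute speculatively, and
 * the second one pulls in a cache line whose address depends on the
 * secret byte. A timing probe over array2 recovers the byte later, even
 * though the architectural result is thrown away. */
#include <stddef.h>
#include <stdint.h>

uint8_t array1[16];
unsigned array1_size = 16;
uint8_t array2[256 * 4096];

void victim(size_t x) {
    if (x < array1_size) {                          /* predicted taken */
        uint8_t secret = array1[x];                 /* speculative out-of-bounds read */
        volatile uint8_t y = array2[secret * 4096]; /* cache state now encodes 'secret' */
        (void)y;
    }
}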

I need to find a Sons of Yale Patt board to get more gossip on this stuff - it's fascinating times right now.
 

jimv

Member
SoSH Member
Feb 5, 2011
1,122
....but I love the technical part so mostly I'll focus on that.....
did we just become friends?

....Intel's 10nm might be competitive with GF (i.e. Samsung's) 7nm in terms of actual achieved density - and maybe better on some process performance metrics. So even if INTC has bungled some things, they are pretty damn good at this process stuff, maybe good enough. I need to dig more on TSMC's equivalent in that node.
Intel's 10nm is roughly equivalent to Samsung/TSMC 7nm in transistor density. Can't remember where I read it, but TSMC claims their 7nm rollout is on schedule for 2018. Meanwhile, Intel's 10nm rollout has been pushed back into 2018. Given recent track records, first to market is unclear.

For years Intel tick-tocked their way into a process advantage. It seems to be gone now, and they need to get back up to speed or it will start impacting the bottom line.
 

Blacken

Robespierre in a Cape
SoSH Member
Jul 24, 2007
12,152
Sure, sure, @jimv gets a shout-out and I get NOTHIN'. :(

Intel (with x86) is biased micro-architecturally to higher performance, and ARM to lower power
Only insofar as execution resources (uarch points of contention) and pipeline length are concerned, as it has been explained to me. And both are pretty well-understood chip facilities--the hard part is that interconnects at practical and usable speeds are more than just rectangle slinging.

The ISA doesn't really matter so much; my intuition is that turning ARM into a desktop-scale pipelined processor with x86 levels of OOE will turn it into an x86-class power hog.
 

cgori

Member
SoSH Member
Oct 2, 2004
4,249
SF, CA
Fair enough @Blacken - your second post came in while I was typing my screed into the post editor and was pretty spot-on - I probably should have refreshed the thread before posting. I didn't really have much to say about your first post though.

As far as what you say - you probably know this, but pipeline depth is what drives clock frequency (roughly), with some scaling factors that differ between ARM and x86 due to the relative difficulty/ease of implementing instruction decode. The additional execution units are for allowing more parallelism (ILP = instruction-level parallelism; that's the term in the classic literature). However, Spectre basically puts all that at risk - the more speculation and OOE you do, the more difficult it is to protect sensitive data - so what happens next is going to be quite interesting, to me.
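
(If ILP is unfamiliar, here's the textbook picture in a few lines of C - toy code, not a benchmark, and ignoring FP rounding-order differences. The first loop is one long dependence chain, so extra execution units sit idle; the second hands a superscalar OOE core four independent chains to run at once.)

/* Serial: every add waits on the previous value of s. */
double serial_sum(const double *a, int n) {
    double s = 0.0;
    for (int i = 0; i < n; i++)
        s += a[i];
    return s;
}

/* Four independent accumulators expose ILP to the core's extra units. */
double parallel_sum(const double *a, int n) {
    double s0 = 0, s1 = 0, s2 = 0, s3 = 0;
    int i;
    for (i = 0; i + 3 < n; i += 4) {
        s0 += a[i];
        s1 += a[i + 1];
        s2 += a[i + 2];
        s3 += a[i + 3];
    }
    for (; i < n; i++) s0 += a[i];  /* leftover elements */
    return (s0 + s1) + (s2 + s3);
}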

And your intuition is correct, insofar as the research predicts (which actually would not have been the common wisdom ~15 years ago - that's why that Wisconsin paper was considered interesting). There is nothing inherent about ARM at the ISA level that prevents you from turning it into an OOE monster. Theoretically an ultra-low-power x86 is possible too - I'm not sure if anyone has really thought about the scaling factors and what happens when you detune an x86 design to such a degree, maybe the performance is non-linear / it requires some minimal amount of microarchitectural investment to achieve decent results.

I'm not sure which interconnects you are referring to - the on-die stuff is pretty well-understood now (crosstalk and routing challenges can be largely mitigated), and everyone has the same off-die problems for memory interconnect since the memory is a standard - unless you are talking about L3 which is more a packaging / pins problem. Intel does have some serious package-design whiz-kids in house for that stuff.

I just read that gizmodo article - the linked twitter thread is a good read and roughly aligns with my intuition. Some of the other scenarios in the gizmodo column seem... dubious to me.
 

glasspusher

Member
SoSH Member
Jul 20, 2005
9,973
Oakland California
stupid question for those of us who haven't been paying attention for the last 10 years or so- is Intel's x86 still considered a RISC chip with a CISC front end, or am I way off?
 

cgori

Member
SoSH Member
Oct 2, 2004
4,249
SF, CA
stupid question for those of us who haven't been paying attention for the last 10 years or so- is Intel's x86 still considered a RISC chip with a CISC front end, or am I way off?
Yes. Everything “normal” is reduced to micro-ops (u-ops) and handled more or less like classical RISC. There is special case handling in microcode for some of the more esoteric x86 instructions.

When stored, the x86 code is actually denser than ARM (or most RISC), so it needs less I-fetch bandwidth, but it needs more area spent on decoders because the instructions are harder to interpret (or even align, in some cases).
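
To put rough numbers on the density point - the same one-line function on both ISAs, with the encodings written out from memory, so treat them as approximate:

/* int add1(int x) { return x + 1; } compiled at -O2:
 *
 *   x86-64 (variable length, 1-15 bytes per instruction):
 *       8d 47 01   lea eax, [rdi+1]   ; 3 bytes
 *       c3         ret                ; 1 byte   -> 4 bytes total
 *
 *   ARM64 (fixed 4 bytes per instruction):
 *       11000400   add w0, w0, #1     ; 4 bytes
 *       d65f03c0   ret                ; 4 bytes  -> 8 bytes total
 *
 * The x86 version fetches fewer bytes, but the decoder has to discover
 * instruction boundaries byte by byte, which is where the extra decode
 * area goes. */
int add1(int x) { return x + 1; }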
 

cgori

Member
SoSH Member
Oct 2, 2004
4,249
SF, CA
did we just become friends?


Intel's 10nm is roughly equivalent to Samsung/TSMC 7nm in transistor density. Can't remember where I read it, but TSMC claims their 7nm rollout is on schedule for 2018. Meanwhile, Intel's 10nm rollout has been pushed back into 2018. Given recent track records, first to market is unclear.

For years Intel tick-tocked their way into a process advantage. It seems to be gone now, and they need to get back up to speed or it will start impacting the bottom line.
I sorta hate Quora for most things but this answer I saw today roughly matches what you say (in terms of GF and Intel's marketing terminology, maybe not covering TSMC but I suspect things are similar there too):

https://www.quora.com/Why-is-Intel-stuck-at-14nm-and-behind-in-10nm-though-GloFlo-is-readying-7nm-technology-for-Ryzen
 

jimv

Member
SoSH Member
Feb 5, 2011
1,122
Thanks for bumping the thread!

Obscured by their fantastic 1st-quarter results, Intel announced that their 10nm process will be delayed until 2019. Yields are apparently still unacceptably low.

Meanwhile, TSMC has started HVM (high-volume manufacturing) of its 7nm chips.
 

NortheasternPJ

Member
SoSH Member
Nov 16, 2004
20,079
After owning an iPad Air 1, I caved and bought the iPad Pro today. That machine seems like a beast, and the second step toward replacing Intel in laptops. I'm planning on leaving my MacBook Pro at home for work 90% of the time from now on and taking this.
 

Papelbon's Poutine

Homeland Security
SoSH Member
Dec 4, 2005
19,615
Portsmouth, NH
I played around with a coworker’s recently and yeah, they seem like a monster. I just can’t justify it, as much as I’d rather work with that than my personal mini or my work-issued, one-year-old iPad. I feel like I’d have to get used to the size difference to get the benefit.
 

NortheasternPJ

Member
SoSH Member
Nov 16, 2004
20,079
I just got mine yesterday (11”) with the Smart Keyboard case, and this may be the best device I’ve owned. The keyboard is actually really good. When I first got it, it seemed not great, but I love the keyboard now; I can actually type at full speed on it.

The iPad itself is extremely fast and the screen is amazing. I love the new look and the older iPad style looks extremely dated after using this for a day. I hope Apple continues to improve iOS to take advantage of the system.
 

wade boggs chicken dinner

Member
SoSH Member
Mar 26, 2005
32,663
I played around with a coworker’s recently and yeah, they seem like a monster. I just can’t justify it, as much as I’d rather work with that than my personal mini or my work-issued, one-year-old iPad. I feel like I’d have to get used to the size difference to get the benefit.
The chip inside the new iPad Pro - the A12X - is a monster, and it's apparently faster than 92% of all chips out there, including most Intel Core i7 iterations. That's pretty impressive.

Just for reference, Microsoft has been trying to put ARM chips in Windows 10 computers, promoting incredible battery life and being "always connected" to the internet. The first models, based on the Snapdragon 835, didn't go over very well, being way too underpowered for Windows 10.

The next generation is based on the Snapdragon 850, which is an exclusive Windows chip. Better, but not there yet. It's about half as powerful as the A12 in the new iPhones (which in turn is about 5-10% slower than the A12X in the iPad Pro).

Qualcomm is prepping a new chip just for Windows 10 PCs right now, named the 8180. It's going to be a 14nm chip with 8 cores, so it should be really powerful, but it will be interesting to see how much battery life it will have.

Apple deserves kudos for its chip design.
 

Max Power

thai good. you like shirt?
SoSH Member
Jul 20, 2005
8,561
Boston, MA
The chip inside the new iPad Pro - the A12X - is a monster, and it's apparently faster than 92% of all chips out there, including most Intel Core i7 iterations. That's pretty impressive.

Just for reference, Microsoft has been trying to put ARM chips in Windows 10 computers, promoting incredible battery life and being "always connected" to the internet. The first models, based on the Snapdragon 835, didn't go over very well, being way too underpowered for Windows 10.

The next generation is based on the Snapdragon 850, which is an exclusive Windows chip. Better, but not there yet. It's about half as powerful as the A12 in the new iPhones (which in turn is about 5-10% slower than the A12X in the iPad Pro).

Qualcomm is prepping a new chip just for Windows 10 PCs right now, named the 8180. It's going to be a 14nm chip with 8 cores, so it should be really powerful, but it will be interesting to see how much battery life it will have.

Apple deserves kudos for its chip design.
I got an HP Envy x2 Windows-on-ARM computer secondhand last week, and it's exceeded my expectations. The Snapdragon 835 isn't super fast, but I haven't felt like it's frustratingly slow doing anything. Instant-on with Windows Hello face recognition means it's always ready as soon as I want to use it. The battery life is incredible. I've charged it twice since I got it last Wednesday, and that included an OS upgrade from 1803 to 1809. The keyboard and touchpad are extremely high quality, too.

This is the future of Windows PCs if they can get x64 app compatibility working and a bit of an overall speed boost.
 

NortheasternPJ

Member
SoSH Member
Nov 16, 2004
20,079
I completely agree. I fully expect that Apple has a full macOS running on ARM, and probably an iOS Pro-type OS. After a week with my iPad Pro, I find myself using my MacBook Pro less and less. The reality is that most of my day-to-day work at this point is email, chat, PowerPoint, etc. With the USB-C to HDMI adapter, I save my computer for real work and bring my iPad to customers. There are some real limitations, but not carrying a laptop bag, adapters, power cords, etc. is awesome.
 

cgori

Member
SoSH Member
Oct 2, 2004
4,249
SF, CA
I got an HP Envy x2 Windows-on-ARM computer secondhand last week, and it's exceeded my expectations. The Snapdragon 835 isn't super fast, but I haven't felt like it's frustratingly slow doing anything. Instant-on with Windows Hello face recognition means it's always ready as soon as I want to use it. The battery life is incredible. I've charged it twice since I got it last Wednesday, and that included an OS upgrade from 1803 to 1809. The keyboard and touchpad are extremely high quality, too.

This is the future of Windows PCs if they can get x64 app compatibility working and a bit of an overall speed boost.
That x64 compatibility is a big IF - many people have tried to do things like this and found the performance wanting (these days you might be able to get the performance where you want it, but the battery life would take a hit).
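
To show why the naive approach is slow, here's a toy decode-and-dispatch loop for three x86 opcodes - nothing like a real emulator (no flags, no memory model, no prefixes), just an illustration that every short guest instruction costs a branch, a compare chain, and a pile of host work. That's why the serious efforts translate whole blocks ahead of time instead of interpreting:

/* Toy x86 interpreter: runs "mov eax, 41; inc eax; ret". */
#include <stdint.h>
#include <stdio.h>

int main(void) {
    const uint8_t code[] = { 0xB8, 0x29, 0x00, 0x00, 0x00,  /* mov eax, 41 */
                             0xFF, 0xC0,                    /* inc eax */
                             0xC3 };                        /* ret */
    uint32_t eax = 0;
    size_t pc = 0;

    for (;;) {
        uint8_t op = code[pc++];
        if (op == 0xB8) {                    /* mov eax, imm32 (little-endian) */
            eax = (uint32_t)code[pc] | (uint32_t)code[pc + 1] << 8
                | (uint32_t)code[pc + 2] << 16 | (uint32_t)code[pc + 3] << 24;
            pc += 4;
        } else if (op == 0xFF && code[pc] == 0xC0) {  /* inc eax */
            pc++;
            eax++;
        } else if (op == 0xC3) {             /* ret */
            break;
        } else {
            fprintf(stderr, "unhandled opcode %02x\n", op);
            return 1;
        }
    }
    printf("eax = %u\n", eax);               /* prints eax = 42 */
    return 0;
}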
 

wade boggs chicken dinner

Member
SoSH Member
Mar 26, 2005
32,663
Not to turn this into a Windows thread (as I'm sure I'd be reported if I did that) . . . .

I got an HP Envy x2 Windows-on-ARM computer secondhand last week, and it's exceeded my expectations. The Snapdragon 835 isn't super fast, but I haven't felt like it's frustratingly slow doing anything. Instant-on with Windows Hello face recognition means it's always ready as soon as I want to use it. The battery life is incredible. I've charged it twice since I got it last Wednesday, and that included an OS upgrade from 1803 to 1809. The keyboard and touchpad are extremely high quality, too.

This is the future of Windows PCs if they can get x64 app compatibility working and a bit of an overall speed boost.
You are probably the only person on SoSH who owns one of these. Congrats! However, you might feel differently if you had paid list price for the machine (something like $900+ if I recall correctly).

That x64 compatibility is a big IF - many people have tried to do things like this and found the performance wanting (these days you might be able to get the performance where you want it, but the battery life would take a hit).
MSFT just announced that they are supporting 64-bit apps for Windows 10 ARM machines: https://www.theverge.com/2018/11/16/18098230/microsoft-windows-on-arm-64-bit-app-support-arm64. And from that article: "ARM is also promising laptop-level performance from its Cortex-A76 chip design in 2019. The chip design company has been claiming that ARM processors next year will compete with Intel’s Kaby Lake range on laptops."

I should probably sell my Intel stock now.
 

cgori

Member
SoSH Member
Oct 2, 2004
4,249
SF, CA
MSFT just announced that they are supporting 64-bit apps for Windows 10 ARM machines: https://www.theverge.com/2018/11/16/18098230/microsoft-windows-on-arm-64-bit-app-support-arm64. And from that article: "ARM is also promising laptop-level performance from its Cortex-A76 chip design in 2019. The chip design company has been claiming that ARM processors next year will compete with Intel’s Kaby Lake range on laptops."

I should probably sell my Intel stock now.
I missed that announcement; it's interesting. But it's still different from support for legacy x64 binaries, which is what I thought was being talked about (and is sort of the holy grail, though pretty difficult to achieve). Certainly a "fat binary" approach like the one Apple used during their migration from Moto to PPC would make things a lot more palatable, but it won't solve issues for people who have old software that either can't or won't get updated to ARM.

It will be interesting to see if ARM can hit those perf targets with the A76. Historically ARM's architecture has had a hard time reaching the same performance peaks as x86, and especially for single-threaded workloads which are what most people care about / perceive in terms of responsiveness of a machine. (Conversely, Intel had a really hard time scaling down the energy-efficiency of x86 for embedded/mobile platforms.)

I'm also curious to see if the ARM64 instruction set encoding makes things less power-efficient, need to go find some papers now.
 

Max Power

thai good. you like shirt?
SoSH Member
Jul 20, 2005
8,561
Boston, MA
Not to turn this into a Windows thread (as I'm sure I'd be reported if I did that) . . . .
I think running desktop OSes on mobile chips is close enough to the topic to apply.

You are probably the only person on SoSH who owns one of these. Congrats! However, you might feel differently if you had paid list price for the machine (something like $900+ if I recall correctly).
I'm sure I'm one of the few even outside SoSH. I got it for $450 in like new condition. Amazon has them for around $600, which is a fair deal. Compare that to an iPad with a keyboard and pen and it looks even better.
 

cgori

Member
SoSH Member
Oct 2, 2004
4,249
SF, CA
Amazon announced availability of ARM-based AWS instances the other day, using their own chip built by the Annapurna team that they acquired ~3 years ago. The benchmark results are all over the place - FP looks especially good, raytracer results are great, website serving results are meh at best - 16 cores of ARM don't keep up with 5 cores of Xeon E5-2697 v4.

The instances are cheap to deploy though (as low as $0.0255/hr) - probably this is a combination of cost-to-acquire/build and cost-to-operate for Amazon, but I'm just guessing.
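
(Side note: the first thing I'd run on one of those instances is a check of what you actually landed on - a minimal POSIX sketch:)

/* Prints "aarch64" on the new ARM instances, "x86_64" on the Xeons. */
#include <stdio.h>
#include <sys/utsname.h>

int main(void) {
    struct utsname u;
    if (uname(&u) != 0) {
        perror("uname");
        return 1;
    }
    printf("machine: %s\n", u.machine);
    return 0;
}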
 

NortheasternPJ

Member
SoSH Member
Nov 16, 2004
20,079
I love that the new Mac Pro is finally the old Mac Pro design. The cheese-grater Mac Pro was awesome.
 

NortheasternPJ

Member
SoSH Member
Nov 16, 2004
20,079
Yup, it's finally back to being Pro again. The new Retina 6K monitors look good too.
The LOL moment when they said the Pro Display XDR stand was $999 was great. Only Apple can charge $1k for a stand. He was caught off guard and stuttered when everyone laughed at him.
 

ifmanis5

Member
SoSH Member
Sep 29, 2007
65,840
Rotten Apple
The LOL moment when they said the Pro Display XDR stand was $999 was great. Only Apple can charge $1k for a stand. He was caught off guard and stuttered when everyone laughed at him.
Yup. And getting roasted on Twitter for it.

All my video editing peeps are excited about the new Pro. Apple marketing the monitor (which is nice) as a professional reference monitor is ridiculous, though.
 

NortheasternPJ

Member
SoSH Member
Nov 16, 2004
20,079
Yup. And getting roasted on Twitter for it.

All my video editing peeps are excited about the new Pro. Apple marketing the monitor (which is nice) as a professional reference monitor is ridiculous, though.
Outside of the joke of the $999 stand, can you shed any light on the actual display? Obviously they went head to head with a $30k-plus Sony display. Is this monitor actually any better or worse? I was wondering who actually makes the panel, which I'm sure we'll soon find out.
 

ifmanis5

Member
SoSH Member
Sep 29, 2007
65,840
Rotten Apple
Outside of the joke of the $999 stand, can you shed any light on the actual display? Obviously they went head to head with a $30k-plus Sony display. Is this monitor actually any better or worse? I was wondering who actually makes the panel, which I'm sure we'll soon find out.
Can't really say much about the new monitor (which, again, is great and long overdue), but to compare it with a pro reference monitor - which has a lot more pro features (like built-in broadcast scopes and meters, LUT support, support for various pro color spaces, knobs to adjust color temps and gamma curves; the list goes on...), is built to much tighter specifications, and is serviced by a pro technician with a spectrometer - is not a fair comp.
Here's a link to a great all-around pro reference monitor: http://flandersscientific.com/AM550/
Most users of the Apple monitor don't need that stuff, and if they do, they'll get a Flanders, a DreamColor, or a DolbyVision instead.
 

canderson

Mr. Brightside
SoSH Member
Jul 16, 2005
40,934
Harrisburg, Pa.
Biggest news today is HomeKit integration with surveillance cameras - no charge if you have an iCloud subscription.

I love Nest but I’ll drop them in a flash if Arlo’s next cameras are good.

They also made the iPad Pro usable ...
 

dirtynine

Member
SoSH Member
Dec 17, 2002
8,779
Philly
I watched one of the “Whole presentation in 10 minutes” recap videos, and saw that the watch is allowing devs to access the audio streaming API - hopefully this means Spotify over cellular is on the way.
 

Gorton Fisherman

Well-Known Member
Lifetime Member
SoSH Member
May 26, 2002
2,493
Port Orange, FL
Biggest news today is HomeKit integration with surveillance cameras - no charge if you have an iCloud subscription.

I love Nest but I’ll drop them in a flash if Arlo’s next cameras are good.
Just curious, whatsa matter with the current line of Arlo cameras? (Sorry, I'm a total home security newb, just starting to research this stuff, and am very interested in possibly putting in a ground-up HomeKit-compatible security system at the house, especially in light of yesterday's WWDC announcement re: cameras.)
 

canderson

Mr. Brightside
SoSH Member
Jul 16, 2005
40,934
Harrisburg, Pa.
Just curious, whatsa matter with the current line of Arlo cameras? (Sorry, I'm a total home security newb, just starting to research this stuff, and am very interested in possibly putting in a ground-up HomeKit-compatible security system at the house, especially in light of yesterday's WWDC announcement re: cameras.)
Oh, I didn't mean that as a knock on Arlo. Just that I have several Nest cams and there isn't any current reason to dump them for a competitor. This could change that.
 

NortheasternPJ

Member
SoSH Member
Nov 16, 2004
20,079
We'll have to see real world numbers, but these new Macs seem ummm fast? The GPU numbers alone are great.

20 hours of video playback in a 13" Macbook Pro? I think it's pretty clear why Apple's ditching Intel.
 

cgori

Member
SoSH Member
Oct 2, 2004
4,249
SF, CA
We'll have to see real world numbers, but these new Macs seem ummm fast? The GPU numbers alone are great.

20 hours of video playback in a 13" Macbook Pro? I think it's pretty clear why Apple's ditching Intel.
It doesn't surprise me at all that either the CPU (ARM vs. x86) or the GPU (in-house vs. the previous Imagination-derived design) is more power efficient. The real-world / independent benchmarks will be interesting, as the footnotes to some of the perf claims seem to have typical marketing-weasel language in them to me.

I'm not tracking this at all today - was there any discussion of "fat binary" or some other way to manage compatibility during the CPU switchover? I see something about Rosetta 2, so emulation is the path?
 

SumnerH

Malt Liquor Picker
Dope
SoSH Member
Jul 18, 2005
32,712
Asheville, NC
I'm not tracking this at all today - was there any discussion of "fat binary" or some other way to manage compatibility during the CPU switchover? I see something about Rosetta 2, so emulation is the path?
Emulation is the path. Preliminary reports are positive for the emulation performance, but real-world results remain to be seen.
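
Apple does document a sysctl for asking whether the current process is being translated, if you want to check what you're actually running under - a minimal sketch:

/* 1 = x86_64 binary translated by Rosetta 2, 0 = native; the key does
 * not exist at all on Intel Macs or pre-Big Sur systems. */
#include <stdio.h>
#include <sys/sysctl.h>

int main(void) {
    int translated = 0;
    size_t size = sizeof(translated);
    if (sysctlbyname("sysctl.proc_translated", &translated, &size, NULL, 0) == -1) {
        printf("no translation key (native, pre-Rosetta system)\n");
        return 0;
    }
    printf(translated ? "running under Rosetta 2\n" : "running natively\n");
    return 0;
}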
 

soxhop411

news aggravator
SoSH Member
Dec 4, 2009
47,802
Looks like the MBP redesign will include much more than just an M1 chip. It includes the possible return of the MagSafe charger.
Following today's report from analyst Ming-Chi Kuo outlining major changes for the next-generation MacBook Pro models coming in the third quarter of this year, Bloomberg's Mark Gurman has weighed in with his own report corroborating some of the details but seemingly differing a bit on others.
First, Gurman shares more details on the return of MagSafe charging to the MacBook Pro, indicating that it will indeed be a similar design to the previous incarnation of ‌MagSafe‌ on Mac notebooks. Gurman also says the shift back to dedicated ‌MagSafe‌ charging will allow for faster charging speeds.

Gurman says the new MacBook Pro models will unsurprisingly continue to support USB-C, with two USB-C ports located next to the ‌MagSafe‌ port, and presumably two more USB-C ports on the other side of the machine.

Bloomberg's report also offers a tidbit about the displays on the upcoming MacBook Pro models, indicating they will use "brighter, higher-contrast panels." The machines will also of course come equipped with Apple silicon chips offering more processing cores and improved graphics compared to the M1 found in the first batch of Apple silicon Macs.

As for the design of the new MacBook Pro models, Kuo had indicated that they would receive an iPhone 12-style redesign with flat edges, but Gurman seems to downplay the significance of any changes, indicating that they will "look similar" to the current models but with "minor design changes."

Gurman also says that Apple has "tested" versions of the MacBook Pro without the Touch Bar, while Kuo seems more definitive that the controversial feature will be removed in the final design.
https://www.macrumors.com/2021/01/15/macbook-pro-better-displays-faster-magsafe-charging/
 

voidfunkt

Member
SoSH Member
Apr 14, 2006
1,519
/dev/null
It includes the possible return of the MagSafe charger
I'm not really into Apple hardware, but MagSafe was hands down one of the best ideas ever. I'd argue it was probably the best hardware decision Apple ever made beyond the switch from PPC to Intel. Removing it was a horrible move.