Part II: The A12 is not just a speed bump


Apple made an unprecedented string of hardware announcements this past week, including iMacs, iPads and AirPods.  All of Apple’s iPads now sport an A12 or A12X.

For the iPad Mini, the A12 is a significant jump in processing power compared to the A8 of the 4th generation Mini.  The iPad Mini’s page on Apple’s website states that it is “up to 3x faster …”.  In terms of where content is viewed, the Apple TV is the outlier with its A10X.  What does the A12 bring to the equation beyond much improved processing power?  Were the iPad introductions simply getting them out of the way, or were they in preparation for the streaming announcement?

My working hypothesis is that the A12 is important to the streaming offering and that it will appear in the Apple TV.  Why would I think this?  Isn’t tomorrow’s event solely about services?

I want to return to a comment from Steve Jobs.  In 2008, he stated that Apple had acquired PA Semi to further differentiate Apple products through semiconductor design.  I think the same logic holds here for the move into video streaming.

As noted in Part I, most are simply talking about content.  Some are talking about bundling, and some about the creation of a content marketplace.  Others have mentioned a possible video game subscription service.  I see the ability to identify users as an important aspect of many of these services.  Apple already does this with its various biometric approaches and the Secure Enclave of the A-series processors.  I thought about this in 2016 when the T1 chip was included in the MacBook Pro.  A TV remote with secure functionality and a second screen might prove interesting.  Of course, the Secure Enclave is on the A12.
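To make that concrete, here is a minimal sketch, assuming a hypothetical per-profile unlock in a streaming app, of how biometrics backed by the Secure Enclave are reached from software via Apple’s LocalAuthentication framework.  The function name and reason string are mine, not anything Apple has shown.

```swift
import LocalAuthentication

// Hypothetical sketch: gate a viewing profile behind Face ID / Touch ID,
// which is backed by the Secure Enclave on A-series (and T-series) silicon.
func unlockProfile(completion: @escaping (Bool) -> Void) {
    let context = LAContext()
    var error: NSError?

    // Check whether biometrics are available and enrolled on this device.
    guard context.canEvaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                                    error: &error) else {
        completion(false)   // Fall back to a passcode or shared profile.
        return
    }

    // Prompt the user; success means the Secure Enclave verified the biometric.
    context.evaluatePolicy(.deviceOwnerAuthenticationWithBiometrics,
                           localizedReason: "Switch to your viewing profile") { success, _ in
        completion(success)
    }
}
```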

If we go back to the September introduction of the A12, we were told about its “console level graphics.”  This may prove important to both the video streaming and game streaming services.  This level of graphics was not possible with the A10X of the current Apple TV or the A8 of the iPad Mini 4.  Again, it is about having these compute capabilities available whether one is using an iDevice or a TV for any streaming service.
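As a rough illustration, and assuming a hypothetical app that scales its rendering to the GPU, the Metal feature-set query below is how software could distinguish an A12-class GPU from an A10X or A8 at runtime.  The render-path names are placeholders of mine.

```swift
import Metal

// Sketch: pick a rendering path based on the GPU's Metal feature set.
// iOS_GPUFamily5_v1 corresponds to the A12-class GPU (iOS 12).
func selectRenderPath() -> String {
    guard let device = MTLCreateSystemDefaultDevice() else {
        return "no-gpu"            // No Metal device available.
    }
    if device.supportsFeatureSet(.iOS_GPUFamily5_v1) {
        return "console-level"     // A12-class GPU: enable the richer effects.
    }
    return "baseline"              // Older GPUs (A8, A10X) get a simpler path.
}
```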

Another circuit block of interest is the Neural Engine.  It is not present on either the A8 or A10X.  We know that this circuit block is central to Apple’s AI and AR efforts.  This brings us back to Gene Munster’s comments in Part I.
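For completeness, here is a small Core ML sketch of how an app opts into the Neural Engine on devices that have one.  The model name is a placeholder, and the streaming tie-in is my speculation, not a known Apple feature.

```swift
import CoreML
import Foundation

// Sketch: load a Core ML model and let the framework schedule inference on
// the CPU, GPU, or Neural Engine. On an A12 the Neural Engine does the
// heavy lifting; on an A8 or A10X the same code falls back to CPU/GPU.
func loadModel() throws -> MLModel {
    let config = MLModelConfiguration()
    config.computeUnits = .all   // Allow all compute units, including the Neural Engine.

    // "SceneTagger" is a hypothetical compiled model bundled with the app.
    guard let url = Bundle.main.url(forResource: "SceneTagger", withExtension: "mlmodelc") else {
        throw NSError(domain: "ModelLoading", code: 1, userInfo: nil)
    }
    return try MLModel(contentsOf: url, configuration: config)
}
```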

There will be other relevant blocks or capabilities, but these three (the Secure Enclave, the GPU, and the Neural Engine) are prominent in Apple’s marketing.  Thus, I think there will be at least an A12 in the iDevices associated with consuming the streaming content.

Pushing further, the current Apple TV page discusses the use of an iPhone as a remote.  I think one could move to the next level.  What if an iPhone or iPad were used as a TV remote, a game controller and an AR interface to the streaming content?  One could take pictures of oneself or interact with the content being streamed.  That might be interesting and would differentiate any streaming service.
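Purely as a sketch of that speculation, here is how the AR half of such a remote might begin on the phone side.  Only the ARKit calls are standard API; the idea of tying the session to streamed content is hypothetical.

```swift
import ARKit

// Speculative sketch: the phone acts as a remote while also running an AR
// session, so streamed content could be anchored in the viewer's room.
final class RemoteARController {
    let session = ARSession()

    func startIfPossible() {
        // World tracking needs an A9 or later; older devices stay a plain remote.
        guard ARWorldTrackingConfiguration.isSupported else { return }

        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        session.run(configuration)
        // From here the app could place show-related content on detected
        // surfaces or composite the camera into a shared scene (hypothetical).
    }
}
```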

In the end, I do not have any advance knowledge from fireside chats in Cupertino.  But I think differentiating features can be brought to any streaming service through Apple hardware, more particularly through Apple’s semiconductor efforts.  To this end, I see considerable utility in using an iPad Mini or iPhone as a remote and as a means of interacting with any streaming content.  Time will tell whether Apple has a similar vision for differentiation or whether it sticks to simple content.