Intel’s New Ice Lake Processors Feature AI-on-a-Chip but What Does That Change for Bots?

intel, processor, AI

Mobile processors have had AI features for the last generation or two, but Intel has finally caught up, bringing on-chip AI to upcoming notebook and desktop models. What does that mean for chatbots, AI services and the developers behind them?

The processor wars got boring after the fun of the Intel Pentium vs AMD Athlon battles of the late 90s. Since then, notebook processor development has largely been about shrinking the die to ever-smaller nanometer processes to gain speed and efficiency. The key benefit was cooler running, enabling today's thin, light and quiet notebook designs with better battery life. But for the typical user editing documents, typing emails or hanging out on social media, there were few noticeable end results beyond less back strain and cooler thighs.

You can run Windows 10 and the usual apps quite happily on a 10-year-old PC; only gamers (who rely more on the GPU), heavy number crunchers, video editors and rendering artists really need the sheer power that recent processors deliver.

A smarter display of power

Enter the new Ice Lake processors, coming to notebooks by winter, which cram even more transistors onto the chip thanks to a 10-nanometer process. Like Intel's current 9th-generation chips, the high-end versions will have four cores and eight threads for blistering multitasking. Beyond the usual benefits of faster/cooler/lighter, Intel is adding faster Wi-Fi 6, Thunderbolt 3 support and other features for a better user experience. On the AI side, Intel promises, "Laptops with 10th Gen Intel Core processors learn and adapt to what you do, leveraging the benefits of built-in AI instructions to help you get things done faster and more fluidly."

Apple and Samsung have had AI components on their recent mobile chips, so to see where Intel's army of developers could go with this, let's look at what's been happening on the mobile front. Much of the effort has gone into taking better photos with those zoom-limited smartphone cameras: Apple's AI team uses the A11 Bionic chip's neural engine to create those cool Portrait Mode shots and to power Face ID recognition. Samsung uses its silicon to put the AI agent Bixby in phones as well as in fridges, smart TVs and other devices.

How will developers use on-chip AI?

Yet Intel's new chips are focused on notebooks, so none of those use cases really applies. Muddying the waters is Intel's separate line of dedicated AI silicon, its neuromorphic Loihi chips (recently assembled into the Pohoiki Beach research system), which mimic the human brain and could fit anywhere from artificial limbs to games devices. What that suggests to us is that Intel's chips are a solution looking for problems to solve, leaving crafty third-party developers to design the applications that change the world.

To help them along, one feature is Intel's Deep Learning Boost, which gives coders access to the hardware for complex AI workloads via a dedicated instruction set (AVX-512 VNNI) that speeds up neural-network inference. Backing that up is roughly a teraflop of compute from the integrated GPU for sustained, high-throughput inference applications (GPUs are often used for AI work thanks to their massively parallel design).
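To make that concrete, DL Boost's VNNI instructions accelerate low-precision (int8) multiply-accumulate operations, the core of neural-network inference. The NumPy sketch below illustrates the quantize-multiply-dequantize pattern such instructions speed up; the scales and toy layer sizes are purely illustrative, and real deployments would use a framework that emits the hardware instructions rather than NumPy.

```python
import numpy as np

def quantize(x, scale):
    # Map float32 values to int8, the narrow format that DL Boost
    # (AVX-512 VNNI) is designed to multiply-accumulate quickly.
    return np.clip(np.round(x / scale), -128, 127).astype(np.int8)

# Toy layer: float32 weights and activations (values are illustrative).
rng = np.random.default_rng(0)
weights = rng.standard_normal((4, 3)).astype(np.float32)
activations = rng.standard_normal((3,)).astype(np.float32)

w_scale, a_scale = 0.05, 0.05
w_q = quantize(weights, w_scale)
a_q = quantize(activations, a_scale)

# int8 x int8 multiply with int32 accumulation: the dot-product pattern
# that VNNI collapses into a single instruction on supporting hardware.
acc = w_q.astype(np.int32) @ a_q.astype(np.int32)

# De-quantize and compare against the full-precision result; the gap is
# the (small) accuracy cost paid for the int8 speedup.
approx = acc * (w_scale * a_scale)
exact = weights @ activations
error = np.max(np.abs(approx - exact))
```

The trade-off shown here, a small loss of precision in exchange for much cheaper arithmetic, is why int8 inference has become the standard target for on-chip AI acceleration.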

Also on board is Intel's Gaussian & Neural Accelerator (GNA), a low-power engine for background AI workloads, and this is where traditional AI and chatbot developers could take an interest. It can offload tasks like voice processing, helping speech-to-text and text-to-speech services run on the device for faster translation and communication.

With Microsoft's voice assistant Cortana quietly being sidelined, there's a great opportunity to build voice assistants that move seamlessly from notebook to mobile, switching between chatbot and voice mode as required and encouraging new levels of cross-linking between services.

That could mean your "main" bot talking to a bank bot that wants to remind you about an upcoming manual payment, while your local sports bot chimes in with news of unsold tickets for your favorite team and your doctor's bot reminds you of that appointment later in the week.

Linking all those bots together requires trust (think blockchain) and AI smart enough to distinguish essential events, casual reminders and useful opportunities across a myriad of sources. That's where notebooks and mobiles can work together, beyond everything being tied to your usual Google account.

Then there are the truly novel use cases we haven't even thought of yet, with developers using AI to write our emails and documents for us (in our personal style, quirks included), going far beyond the simple suggestions we see today in Gmail and web search.


Bots could also represent us when we're not available, making sure suggested meetings don't clash and responding to questions with definitive answers based on our deeper desires: "Yes, I'd love to go to the opera!" or "No, I'm not going to grandma's this weekend!"

Yes, this does mean the AI will be listening in more, creating potential privacy issues. It may also mean we move away from the term "AI" as these agents begin to better represent us; something more consumer-friendly like "AIvatar" could take over once the original term, stretched across so many uses, becomes meaningless.

The true quest for AI

The arrival of AI in hardware will bring big changes for computing and for us fleshy users, with chatbots, virtual agents and other forms of interaction helping lead the way. The changes will come at us quickly as developers with crazy or brilliant new ideas try to steal the market.

AI on a notebook that workers use for eight hours a day could enable new types of monitoring beyond the usual productivity checks. An app could watch for signs of stress, encourage people to take a break, or match types of work to people's creativity levels. At home, AI could monitor mental state through browsing habits and send alerts to parents or guardians of minors or vulnerable users. There are many possibilities, and all of them require us to trust our AI more, a key challenge for any app or service creator, especially in light of Facebook's roughshod approach to privacy and user rights.

That means plenty of hype, lots of failures and some successes that will truly change the way we interact with technology. It is certainly an exciting time, but it's important to remember that the technology needs to serve the user, no matter how far it advances beyond those first faltering chatbots.
