
Samsung Exynos 9 chip

Samsung Phone Processor Sports On-Chip AI

Artificial intelligence implementations often run on a powerful server; client devices such as Alexa, Google Home, and mobile apps send their commands and requests to that server.

This design rests on the premise that client devices such as phones and home assistants have limited processing power, and should therefore let a server do the heavy lifting for them. The approach works, but it has its challenges, and there is always room for improvement.
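The cloud-offload pattern described above can be sketched in a few lines. This is an illustrative toy, not any vendor's actual API: the "server" here is just a local function standing in for a remote model, and in a real assistant the client call would be a network request.

```python
# Illustrative sketch of the traditional cloud-AI pattern: a thin
# client forwards every request to a "server" that holds the model.
# (Toy stand-ins only; no real assistant works on keyword matching.)

def cloud_model(transcript: str) -> str:
    """Stands in for a heavyweight model running on a remote server."""
    if "weather" in transcript:
        return "It is sunny today."
    return "Sorry, I didn't catch that."

def thin_client(utterance: str) -> str:
    # In a real assistant this would be a network round trip, which
    # costs bandwidth and uploads the user's speech to the provider.
    return cloud_model(utterance)

print(thin_client("what's the weather like?"))
```

The two pain points the article raises both live in `thin_client`: the round trip consumes data, and the utterance leaves the device.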

If you use a home assistant, what you say to it is stored on the respective service provider's servers, and many people value their privacy. Another drawback of the cloud-AI model is that every request consumes bandwidth or mobile data.

Samsung’s new Exynos 9820 phone SoC (System on Chip) incorporates an NPU for on-chip AI acceleration. This means that phones will be able to execute at least some AI-related functions themselves instead of just passing everything through a cloud server.

AI-related applications do require a large amount of processing power, but that level of performance is now attainable in fairly small chips thanks to various technological advancements, most notably improved processor designs.

This can provide a substantial performance improvement, because on-device processing isn't slowed down by the user's Internet connection (unless a task still partially relies on a server). Any Internet connection is very slow compared to a phone's internal data transfer rates (the rate at which it can move data between RAM and the CPU, for example).
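A rough back-of-the-envelope calculation makes the gap concrete. The figures below are assumptions for illustration, not measurements: a 5 MB payload, a fast 100 Mbit/s internet link, and roughly 25 GB/s of internal memory bandwidth (in the ballpark of LPDDR4X-class phone RAM).

```python
# Back-of-the-envelope comparison (illustrative figures, not measured):
# moving a 5 MB payload over a 100 Mbit/s internet link versus over a
# phone's internal memory bus at an assumed ~25 GB/s.

PAYLOAD_BYTES = 5 * 10**6           # 5 MB request payload
INTERNET_BPS  = 100 * 10**6 / 8     # 100 Mbit/s -> bytes per second
RAM_BPS       = 25 * 10**9          # ~25 GB/s internal bandwidth (assumed)

internet_seconds = PAYLOAD_BYTES / INTERNET_BPS   # 0.4 s
ram_seconds      = PAYLOAD_BYTES / RAM_BPS        # 0.0002 s

print(f"internet: {internet_seconds:.4f} s, RAM: {ram_seconds:.6f} s")
print(f"ratio: {internet_seconds / ram_seconds:.0f}x")
```

Even with these generous network assumptions, the internal path is thousands of times faster, before counting server queueing and round-trip latency.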

In addition, the Exynos 9 chip delivers a 20% improvement in single-core performance or, alternatively, a 40% improvement in power efficiency. Its modem also supports downlink speeds of up to 2.0 Gbps, which is incredibly fast if you have a mobile connection to match.
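To put that 2.0 Gbps figure in perspective, here is a quick sanity check of what the theoretical peak rate works out to (real-world throughput on any network will be lower):

```python
# What a 2.0 Gbit/s peak downlink means in practice: time to pull
# down a 1 GB file at the theoretical maximum rate.

LINK_BPS   = 2.0 * 10**9    # 2.0 Gbit/s peak downlink
FILE_BYTES = 10**9          # a 1 GB file

seconds = FILE_BYTES * 8 / LINK_BPS
print(f"1 GB in {seconds:.0f} s at peak rate")
```

That is a 1 GB download in about four seconds, assuming a network and server that can actually sustain the peak.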
