(Reuters) – Amazon.com said Thursday it has shifted some of the computing behind its Alexa voice assistant to its own custom-designed chips, aiming to make the work faster and cheaper while moving it away from chips supplied by Nvidia.
When users of devices like Amazon’s Echo line of smart speakers ask the voice assistant a question, the query is sent to one of Amazon’s data centers for several processing steps. When Amazon’s computers return an answer, the reply comes back as text that must be translated into audible speech for the voice assistant.
Amazon previously handled that computing using Nvidia chips, but now “most” will occur using its own Inferentia computing chip. First announced in 2018, Amazon’s chip is custom designed to speed up large volumes of machine learning tasks, such as translating text to speech or recognizing images.
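For illustration, here is a minimal sketch of how a machine learning model can be compiled to run on Inferentia using AWS’s Neuron SDK (the torch-neuron package for Inf1 instances); the toy network and shapes below are placeholders, not Amazon’s actual Alexa text-to-speech models:

```python
# A minimal sketch of compiling a PyTorch model for AWS Inferentia
# with the Neuron SDK (torch-neuron, used on Inf1 instances).
# The model here is an illustrative placeholder, not Amazon's
# Alexa pipeline.
import torch
import torch_neuron  # extends torch with torch.neuron.trace

# Placeholder network standing in for a speech-synthesis model.
model = torch.nn.Sequential(
    torch.nn.Linear(128, 256),
    torch.nn.ReLU(),
    torch.nn.Linear(256, 80),  # e.g. 80 mel-spectrogram channels
)
model.eval()

example_input = torch.rand(1, 128)

# Ahead-of-time compilation: operators supported by Neuron are
# compiled for Inferentia's NeuronCores; unsupported operators
# fall back to running on the host CPU.
neuron_model = torch.neuron.trace(model, example_inputs=[example_input])

# The compiled TorchScript artifact is saved and later loaded
# for inference on an Inf1 instance.
neuron_model.save("model_neuron.pt")
```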
Cloud computing customers such as Amazon, Microsoft, and Alphabet’s Google have become some of the largest buyers of computing chips, driving booming data center sales at Intel Corp, Nvidia and others.
But major tech companies are increasingly abandoning traditional silicon suppliers to design their own chips. Apple on Tuesday introduced its first Mac computers with its own central processors, moving away from Intel chips.
Amazon said the switch to the Inferentia chip for some of its Alexa work has resulted in 25% better latency, which is a measure of speed, at a 30% lower cost.
Amazon has also said that “Rekognition,” its cloud-based facial recognition service, has started adopting its own Inferentia chips. However, the company did not say which chips the facial recognition service had previously used or how much of the work had been transferred to its own chips.
The service has come under scrutiny from civil rights groups over its use by law enforcement. In June, following the murder of George Floyd, Amazon imposed a one-year moratorium on police use of the service.
(Reporting by Stephen Nellis in San Francisco; Edited by Tom Brown)