Apple already uses artificial intelligence to help with iPhone image processing through the Neural Engine that has been part of the A-series chips since the iPhone 8 and iPhone X. But future iPhones may see some artificial intelligence processing done directly by the camera sensor itself …
The Verge reports that Sony, which supplies the advanced image sensors used in today’s iPhones, is working on integrating artificial intelligence directly into the sensor. It has already announced the first of these, although it is a model aimed at commercial applications rather than consumer ones.
Sony has announced the world’s first image sensor with embedded artificial intelligence. The new IMX500 sensor incorporates both processing power and memory, allowing it to perform machine learning tasks without additional hardware. The result, Sony says, will be faster, cheaper, and more secure AI cameras. […]
Many applications rely on sending images and video to the cloud for analysis. This can be a slow and insecure journey that exposes data to hackers. In other scenarios, manufacturers have to install specialized processing cores on the device to handle the additional computing demand, as is the case with new high-end phones from Apple, Google, and Huawei.
But Sony says its new image sensor offers a more streamlined solution than either of these approaches.
The IMX500 is aimed at retail and industrial uses, such as the facial recognition that supports cashierless stores, but the company is clearly also considering smartphone applications.
It is not clear whether Apple would take advantage of this type of sensor. The company puts a lot of work into its own image-processing algorithms, but it is possible that future iPhones could incorporate sensors that do some AI work on-sensor and then hand off to the A-series chip for further processing, which could improve both performance and efficiency.
FTC: We use income earning auto affiliate links.