As demand for artificial intelligence (AI) grows, chip makers are striving to build more powerful and more efficient processors. The goal is to meet the requirements of neural networks with better, cheaper solutions while remaining flexible enough to handle evolving algorithms. At Hot Chips 2017, many new deep learning and AI technologies were unveiled, showcasing the different approaches of leading tech firms as well as budding startups. Check out the EETimes survey of Hot Chips for a good summary of the event, focused on chips for AI data centers.
Read the full article on Embedded.