What makes consumer electronics so amazing is how quickly technology once available only to organizations with deep pockets finds its way into something as ubiquitous as a smartphone. A perfect example is machine learning, which governments and technology companies have deployed at scale for years, and which is now making its way into Android devices with help from Qualcomm’s advanced Snapdragon chips.
Recent advancements in cognitive technologies—particularly machine learning—have propelled numerous innovations in areas such as image and facial recognition, natural language processing, and big data mining. IBM’s Watson helping fight disease, Facebook’s automatic photo tagging, and the voice recognition behind Apple’s Siri or Google Now are great examples of what machine learning techniques can deliver. For the most part, though, the intense compute requirements of machine learning have confined it to large data centers: a typical deep learning environment needs teraflops of compute power, tens of gigabytes of RAM, and hundreds of watts of electric power—nothing like the smartphone, tablet, or PC you are using to read this blog.

Running a deep learning algorithm on a mobile device is not without obstacles. An implementation must contend not only with far tighter constraints on compute speed, memory, and power than its data-center relatives, but also with restrictions imposed by thermal limits.
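To put those compute figures in perspective, here is a rough back-of-the-envelope sketch of the cost of a single convolutional layer. The layer dimensions below are hypothetical (loosely modeled on an early layer of a VGG-style network, not taken from any specific product mentioned here), but they illustrate why full networks quickly reach the teraflop range:

```python
# Back-of-the-envelope estimate of the compute and memory cost of one
# convolutional layer, to illustrate why deep learning strains mobile hardware.
# The layer shape below is a hypothetical, VGG-like example.

def conv_layer_cost(h, w, c_in, c_out, k):
    """Return (multiply-accumulates, parameter count) for one conv layer
    with an h x w output feature map, c_in -> c_out channels, k x k kernels."""
    macs = h * w * c_in * c_out * k * k   # one MAC per kernel tap per output element
    params = c_out * c_in * k * k         # weight count (biases ignored)
    return macs, params

# Example: 224x224 feature map, 64 -> 64 channels, 3x3 kernels
macs, params = conv_layer_cost(224, 224, 64, 64, 3)
print(f"{macs / 1e9:.2f} GMACs per image, {params * 4 / 1e6:.2f} MB of fp32 weights")
```

This single layer alone costs about 1.85 billion multiply-accumulates per image; at 30 frames per second that is roughly 55 GMAC/s, and a deep network stacks dozens of such layers, which is how the aggregate requirement climbs into the teraflops a data center can supply but a phone cannot.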