Microsoft built a hardware platform for real-time AI

In many cases, you want AI to act on information as it arrives. A virtual assistant needs to respond within a few seconds at most, and a smart security camera needs to send an alert while intruders are still within sight. Microsoft knows this very well. It just unveiled its own hardware acceleration platform, Project Brainwave, that promises speedy, real-time AI in the cloud. Thanks to Intel’s new Stratix 10 field programmable gate array (FPGA) chip, it can crunch a hefty 39.5 teraflops in machine learning tasks with less than 1 millisecond of latency, and without having to batch tasks together. In other words, it can handle complex AI requests individually, as they’re received.
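To make the no-batching point concrete, here is a minimal, purely illustrative sketch in Python. A toy NumPy matrix multiply stands in for a real model (Brainwave itself runs full neural networks on FPGAs, not NumPy on a CPU); the contrast is between the usual throughput-oriented pattern of queuing requests into a batch and the real-time pattern Brainwave is built for, where each request is served the moment it arrives.

```python
import time
import numpy as np

# Toy stand-in for a trained model: a single dense layer.
# Purely illustrative -- Brainwave runs real DNNs on FPGA fabric, not NumPy on a CPU.
rng = np.random.default_rng(0)
weights = rng.standard_normal((256, 256)).astype(np.float32)

def infer(x: np.ndarray) -> np.ndarray:
    """Run one forward pass (a single matrix multiply here)."""
    return x @ weights

requests = [rng.standard_normal((1, 256)).astype(np.float32) for _ in range(32)]

# Batched serving: wait until 32 requests have arrived, then run them together.
# Good for throughput, but the first request sits idle until the batch fills.
batch = np.vstack(requests)
start = time.perf_counter()
infer(batch)
print(f"batched:       {time.perf_counter() - start:.6f}s for 32 requests")

# Real-time serving: handle each request the moment it arrives (batch size 1).
# This is the mode Brainwave targets -- per-request latency stays low because
# nothing is held back waiting for other requests.
start = time.perf_counter()
for x in requests:
    infer(x)
print(f"one-at-a-time: {time.perf_counter() - start:.6f}s for 32 requests")
```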

It’s considerably more flexible than many of its hard-coded rivals, too. It relies on a ‘soft’ deep neural network (DNN) processing engine loaded onto off-the-shelf FPGA chips, whereas competitors often need their approach locked in from the outset. It can handle Microsoft’s own AI framework (Cognitive Toolkit), but it can also work with Google’s TensorFlow and other systems. You can build a machine learning system the way you like and expect it to run in real time, instead of letting the hardware dictate your methods.
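As a rough illustration of that framework-agnostic idea, here is a minimal sketch using TensorFlow (chosen for illustration only; the article doesn’t describe Brainwave’s actual deployment API, so none is invented here). The model definition is ordinary framework code with nothing hardware-specific in it, exercised at batch size 1, the single-request pattern real-time serving implies.

```python
import numpy as np
import tensorflow as tf

# Define the model in whichever framework you prefer -- plain TensorFlow/Keras here.
# Per the article, Brainwave also accepts graphs from Microsoft's Cognitive Toolkit
# and other systems, so nothing below targets the FPGA hardware directly.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(128,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(10, activation="softmax"),
])

# A single-request (batch size 1) forward pass -- the serving pattern real-time AI
# implies. Here it runs locally; on Azure the same graph would be compiled onto
# FPGA fabric rather than executed on a CPU.
single_request = np.random.rand(1, 128).astype(np.float32)
prediction = model(single_request)
print(prediction.shape)  # (1, 10)

# How a graph is handed to a Brainwave-backed Azure endpoint isn't covered in the
# article, so no deployment call is shown here.
```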

To no one’s surprise, Microsoft plans to make Project Brainwave available through its own Azure cloud services (it’s been big on advanced tech in Azure as of late) so that companies can make use of live AI. There’s no guarantee it will receive wide adoption, but it’s evident that Microsoft doesn’t want to cede any ground to Google, Facebook and others that are making a big deal of internet-delivered AI. It’s betting that companies will gladly flock to Azure if they know they have more control over how their AI runs.

Via: VentureBeat

Source: Microsoft Research Blog, Intel Newsroom


