
Clarifai launches SDK for training AI on your iPhone

Computer vision startup Clarifai has launched a mobile software development kit (SDK) for running artificial intelligence models directly on iOS devices. An Android version is also in the works.

The news was announced onstage today by Clarifai founder Matt Zeiler at MB 2017, a gathering of AI and bot industry innovators being held at Fort Mason in San Francisco, California.

The news is significant because it allows mobile users to carry out AI computations on their iPhone without a connection to the cloud, which is how machine learning is typically handled on mobile devices today. Computer vision that requires no internet connection will be useful for mobile app developers and users in parts of the world where internet speeds or connections aren't reliable.

“This network effect of all the users and all the devices users are using connected to our cloud, that’s going to be the distributed training infrastructure, which is really exciting because when you think about it, the number of servers that even the biggest companies like Google have that is very tiny compared to the number of mobile devices on the planet. And so we want to leverage that to get outside of the data center and really tap into the knowledge of everybody on the planet,” Zeiler said.

Though no internet connection is necessary for the SDK to operate, connecting to Clarifai's cloud will allow users to synchronize or share AI models trained to recognize specific objects, faces, or pets.

Clarifai hopes its computer vision models, combined with the mobile SDK, will be adopted across many verticals, but one early partner already using the mobile SDK is medical imaging company i-Nside. The company has built a device that attaches to the back of a smartphone and looks inside people's ears to detect specific diseases. Images the company has taken over the years have been used to train AI models, and the SDK will now allow doctors and medical professionals to process images and check for disease without potential privacy violations or the need for a data center or internet connection.

The SDK can be used not just for running AI models but also for training them. Clarifai launched its custom training and image search service last fall. Beyond labeling faces or objects, the service can also power a personalized experience on your iOS device.

“Image recognition is often talked about in terms of object recognition, like ‘this is a chair and a table and a dog,’ but it doesn’t have to be something like an object, it can be something like a preference or an intent and the mobile SDK can actually learn that on each device,” he said. “So for example if I share dog pictures with my wife all the time, hitting share with Lisa can be learned and so the next time I have a dog picture and I hit the share button, it recommends Lisa as the top person to share with.”

Clarifai launched in 2013, prior to being named winner of the ImageNet contest, an annual competition to recognize 1,000 categories of images with computer vision. In 2015 Clarifai expanded its visual analysis beyond images to include video. The highly sought-after startup has turned down more than a dozen acquisition attempts by tech giants, Zeiler recently told Reuters.

Last fall, Clarifai raised $30 million to grow its team in a round led by Menlo Ventures with participation from Union Square Ventures, Qualcomm, and Lux Capital.


