11:04 09/11/2019 | 7newstar.com
Applause is looking to reinvent AI testing with a new service by crowdsourcing larger training data sets
(Tech) A key facet of the Applause platform is not only the sheer number of crowd testers in its community but also their demographic diversity, spanning language, race, gender, location, culture, hobbies, and more. This will likely be among the main selling points as Applause looks to repurpose its technology to offer companies access to diverse AI training data.
Applause’s AI training and testing service is offered across five core AI types covering voice, optical character recognition (OCR), image recognition, biometrics, and chatbots. If, for example, a company needs to quickly source varied training data for a virtual voice assistant, Applause users in various locales could be called upon to record and submit specific utterances. Equally, they could submit photos of objects or places or interact with chatbots to iron out any bias. They could even be asked to submit selfies and fingerprints if they’re testing biometric-based security products.
Perhaps more importantly, Applause promises speed and scale for both gathering training data and testing the outputs, allowing companies to garner rapid, iterative feedback from end users in real time. This could work like an ongoing feedback loop, with the gathered data used to improve AI algorithms and then retested on the Applause community.
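The feedback loop described above can be sketched in a few lines. This is a purely illustrative toy, not Applause's actual pipeline: all names here (collect_batch, retrain, evaluate) are hypothetical, and the "model" is just a label tally standing in for a real training step.

```python
# Hypothetical sketch of a crowdsourced train-test feedback loop:
# crowd testers submit labeled samples, the model is retrained on them,
# and the updated model is re-evaluated against the same community.
# None of these function names come from Applause's product.

def collect_batch(n):
    """Simulate n crowd submissions as (sample, label) pairs."""
    return [(f"utterance-{i}", i % 2) for i in range(n)]

def retrain(model, batch):
    """Fold new labeled data into the model (here: a simple label tally)."""
    for _sample, label in batch:
        model[label] = model.get(label, 0) + 1
    return model

def evaluate(model, batch):
    """Score the model on community feedback (majority-label baseline)."""
    majority = max(model, key=model.get)
    correct = sum(1 for _sample, label in batch if label == majority)
    return correct / len(batch)

model = {}
for _round in range(3):          # each round: gather, retrain, retest
    batch = collect_batch(10)
    model = retrain(model, batch)
    score = evaluate(model, batch)
```

In a real deployment the retrain step would update an actual model (voice, OCR, image recognition, and so on) and the evaluation batch would come from fresh community submissions rather than the training batch.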
Similar initiatives out there at the moment include Amazon’s Mechanical Turk, which can be used to crowdsource data for machine learning experiments; DefinedCrowd, which helps create bespoke data sets for AI model training; and Germany’s Clickworker, which specifically focuses on machine vision and conversational AI.
Thanks to more than a decade of software testing with some of the biggest tech companies in the world, however, Applause is well-positioned to harness its existing presence in the developer community and offer vetted crowd testers to improve AI applications by reducing bias.