Ideally, artificial intelligence is free from human biases and emotions, and can therefore make decisions free from prejudice. But AI is built by humans, and humans aren't perfect; their inherent biases surface in their work. Documented examples include everything from language algorithms learning to associate the word “man” with the word “professor” and the word “woman” with “assistant professor,” to ratings of the cuteness of puppies.
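To see how an association like “man is to professor” can show up in a trained model, here's a toy sketch using made-up four-dimensional word vectors (real embeddings are learned from large text corpora and have hundreds of dimensions; the numbers below are purely illustrative):

```python
import numpy as np

# Hypothetical word vectors for illustration only; real embeddings
# are learned from text and would not be hand-written like this.
vectors = {
    "man":       np.array([0.9, 0.1, 0.4, 0.2]),
    "woman":     np.array([0.1, 0.9, 0.4, 0.2]),
    "professor": np.array([0.8, 0.2, 0.9, 0.1]),
    "assistant": np.array([0.2, 0.8, 0.9, 0.1]),
}

def cosine(a, b):
    """Cosine similarity: close to 1.0 means the words are strongly associated."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# The bias shows up as an asymmetry in the similarity scores:
print(cosine(vectors["man"], vectors["professor"]))    # higher
print(cosine(vectors["woman"], vectors["professor"]))  # lower
print(cosine(vectors["woman"], vectors["assistant"]))  # higher again
```

The model never saw an explicit rule linking gender to job titles; the asymmetry comes entirely from the (here, hand-crafted) vectors, just as real embeddings absorb it from the text they're trained on.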
How do we detect and work with model bias in AI, and what are the greater implications? That's the central subject of Singapore City AI's next quarterly event, scheduled for Thursday, August 17, 2017, 5pm-9pm at Unilever in Singapore.
Hosted in conjunction with coworking space LEVEL3, Eventnook, Innovator SG, and 100offer, the event will feature a variety of speakers on different aspects of model bias as well as refreshments. Here's a taste of what you can expect. Register for the event here.
What do cute puppies have to do with model bias in AI? Quite a bit, actually.
Model bias is best explained by starting with, well, a definition of the term. Wan Ting Poh, data scientist at Allianz and managing director of Girls in Tech Singapore, will define model bias, then cover how it shows up in model behavior, including racial and gender bias.
With a focus on practical applications, she'll share tips for debugging learning algorithms and for diagnosing model bias through statistical bias and variance.
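As a rough sketch of that bias-variance diagnosis (a toy example on synthetic data using `numpy.polyfit`, not material from the talk): comparing training error against validation error across models of increasing complexity reveals whether a model is underfitting or overfitting.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a noisy quadratic (a hypothetical stand-in for a real dataset).
x = rng.uniform(-3, 3, 40)
y = x**2 + rng.normal(0, 1.0, 40)
x_train, y_train = x[:30], y[:30]
x_val, y_val = x[30:], y[30:]

def mse(degree):
    """Fit a polynomial of the given degree; return (train_mse, val_mse)."""
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    val_err = np.mean((np.polyval(coeffs, x_val) - y_val) ** 2)
    return train_err, val_err

for degree in (1, 2, 9):
    train_err, val_err = mse(degree)
    print(f"degree {degree}: train={train_err:.2f}  val={val_err:.2f}")
# High bias (degree 1): both errors are high -- the model is too simple.
# Good fit (degree 2): both errors are low -- it matches the true curve.
# High variance (degree 9): training error keeps dropping, while validation
# error typically stops improving or climbs -- the model memorizes noise.
```

Seeing *both* errors high points to bias (simplify less, model more); a large gap between low training error and high validation error points to variance (regularize or gather more data).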
Zane Lim, senior data scientist at Go-Jek, will speak on ranking drivers in real time and how Go-Jek identifies potential bias to keep things fair. He'll also touch on technical approaches, including gradient boosting and neural nets, and how to interpret such complex models.
MSD associate director of global data science competency Jason T. Widjaja works with data scientists to test and create solutions in the commercial, manufacturing, IT, and HR sectors. He'll be speaking about training models for people and talent analytics, namely HR analytics use cases where models are trained on data about people.
Jason will present solutions that combat model bias in organizations that deal with machine learning models on data involving people.
The event will also feature Simon See, director and chief solution architect at the Nvidia AI Technology Centre, as a speaker. Monk's Hill Ventures partner Peng Ong will be part of a panel. Don't forget to register! We'll see you there.
Are you a data scientist looking for your next challenging and rewarding opportunity? Are you a company looking for a trailblazing data scientist who'll put you on the cutting edge? You've come to the right place. 100offer connects companies with curated tech talent. Check out our platform and find your next standout employee here.