Yesterday afternoon I attended a CSE seminar titled "A Gentle Introduction to Machine Learning," presented by Lu Wang, a PhD candidate at Cornell University. I decided to attend because the topic interests me, but mostly because it was billed as a "gentle introduction" to machine learning. It turned out to be just as I hoped: Wang gave a high-level lecture on machine learning that let even those in the room not proficient in the subject take something away.
She first described some of the motivations behind machine learning and its applications, along with an overview of specific tasks in the field. She then showed how classification, a typical machine learning task, is formulated. Then came the meat of her lecture: introducing and explaining K-Nearest Neighbors (K-NN), a popular classification algorithm.
To help the audience understand K-NN and how it works, she used movie preference prediction as a working example. She called movie preference prediction the "million dollar question" because back in 2006–2009 Netflix ran an open competition challenging anyone to beat its ratings-prediction algorithm, with a $1M grand prize for the winning team. Wang walked through the complete K-NN workflow using a simple example: a user who had rated 4 different movies, with the task of predicting whether she would like a new one. The approach used simple probability along with similarity of movie attributes (genre, actors, etc.) to predict whether the user would like the new movie. At the end, she introduced various similarity and distance metrics, each suited to different kinds of data attributes in machine learning tasks. All in all, it was a fantastic seminar, and I felt I was able to take a lot away from it.
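The K-NN workflow she described can be sketched in a few lines of Python. This is a minimal illustration, not Wang's actual example: the movie feature vectors, the two-feature encoding, and the like/dislike labels below are all hypothetical, and Euclidean distance stands in for whichever metric best fits the data.

```python
from collections import Counter
import math

def euclidean(a, b):
    # Euclidean distance between two feature vectors.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_predict(train, labels, query, k=3):
    # Sort rated movies by distance to the new movie, take the k
    # nearest, and return the majority label among those neighbors.
    neighbors = sorted(zip(train, labels),
                       key=lambda pair: euclidean(pair[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Hypothetical data: each movie encoded as [action, romance] scores,
# labeled 1 ("liked") or 0 ("disliked") by the user.
rated = [[9, 1], [8, 2], [1, 9], [2, 8]]
liked = [1, 1, 0, 0]

# An action-heavy new movie: its 3 nearest neighbors are mostly "liked".
print(knn_predict(rated, liked, [7, 3], k=3))  # prints 1
```

Swapping in a different metric (cosine similarity for sparse ratings, Hamming distance for categorical attributes like genre) only requires changing the distance function, which is why the choice of metric got its own segment of the talk.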
Here are some pictures from the seminar: