ML@B Blog

Tricking Neural Networks: Create your own Adversarial Examples

Assassination by neural network. Sound crazy? Well, it might happen someday, and not in the way you may think. Of course neural networks could be trained to pilot drones or operate other weapons of mass destruction, but even an innocuous (and presently available) network trained to drive a car could be turned to act against its owner. This is because neural networks are extremely susceptible to something called adversarial examples.
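To make "adversarial example" concrete, here is a minimal sketch against a toy linear classifier. Everything here is made up for illustration (the weights, features, and the 0.1 perturbation budget); real attacks such as FGSM perturb the input along the loss gradient of a deep network, but for a linear model that gradient is simply the weight vector.

```python
# Toy adversarial perturbation: a tiny, bounded nudge to the input
# flips the classifier's decision. All numbers are illustrative.

def score(w, x, b=0.0):
    """Linear classifier: positive score means class A, negative class B."""
    return sum(wi * xi for wi, xi in zip(w, x)) + b

def sign(v):
    return 1.0 if v > 0 else (-1.0 if v < 0 else 0.0)

def perturb(w, x, eps=0.1):
    """Move each feature by at most eps in the direction that lowers
    the score; for a linear model, the score's gradient w.r.t. x is w."""
    return [xi - eps * sign(wi) for wi, xi in zip(w, x)]

w = [0.5, -0.3, 0.8]     # hypothetical learned weights
x = [0.2, 0.1, 0.0]      # input the model classifies correctly (score > 0)
x_adv = perturb(w, x)    # near-identical input, yet the decision flips

print(score(w, x))       # small positive
print(score(w, x_adv))   # negative: misclassified
```

The unsettling part, which the post explores, is that for high-dimensional inputs like images the perturbation can be far too small for a human to notice.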

Continue reading

Machine Learning Crash Course: Part 5 - Decision Trees and Ensemble Models

Trees are great. They provide food, air, shade, and all the other good stuff we enjoy in life. Decision trees, however, are even cooler. True to their name, decision trees allow us to figure out what to do with all the great data we have in life.

Like it or not, you have been working with decision trees your entire life. When you say, “If it’s raining, I will bring an umbrella,” you’ve just constructed a simple decision tree.
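The umbrella rule above is literally a (very small) decision tree: a root split on whether it is raining, and one further split on a hypothetical forecast threshold. The feature names and the 40% cutoff here are invented for illustration.

```python
# The umbrella decision tree as the nested if/else it amounts to.

def bring_umbrella(raining: bool, chance_of_rain: float) -> bool:
    # Root split: is it raining right now?
    if raining:
        return True
    # Second split: hedge against a likely forecast (hypothetical 40% cutoff).
    return chance_of_rain > 0.4

print(bring_umbrella(True, 0.0))    # raining -> bring it
print(bring_umbrella(False, 0.7))   # likely rain -> bring it
print(bring_umbrella(False, 0.1))   # clear skies -> leave it home
```

A learned decision tree works the same way, except an algorithm picks the features and thresholds from data instead of you picking them by hand.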

Continue reading

Machine Learning Crash Course: Part 4 - The Bias-Variance Dilemma

Here’s a riddle:

Continue reading

ML@B's New Projects

Every semester, Machine Learning at Berkeley takes on new projects. Here’s the latest scoop on the newest projects we’ve taken on this semester.

Continue reading

Machine Learning Crash Course: Part 3

Neural networks are perhaps one of the most exciting recent developments in machine learning. Got a problem? Just throw a neural net at it. Want to make a self-driving car? Throw a neural net at it. Want to fly a helicopter? Throw a neural net at it. Curious about the digestive cycles of your sheep? Heck, throw a neural net at it. This extremely powerful algorithm holds much promise (but can also be a bit overhyped). In this article we’ll go through how a neural network actually works, and in a future article we’ll discuss some of the limitations of these seemingly magical tools.

Continue reading

© Machine Learning at Berkeley 2018 • Please contact us for permission to republish any part of this website