Over the Christmas break I completed an online course (a MOOC) on machine learning by Caltech professor Yaser Abu-Mostafa. He kept stressing techniques for making a learning model, built from a small sample of input data points, reliably 'generalize' to the wider population. The concepts were fascinating. He said this single skill of making a model generalize to the (unseen) population is what separates the "pros" from the "amateurs".
Real life is much the same. By nature, we all create 'mental models' and continuously shape them through the encounters we have at our workplace, in the family and so on. And each person creates very different models from the same set of experiences. The ultimate aim is to apply these learnings to future encounters.
In machine learning, there is the concept of over-fitting, which kills the ability to generalize. The more exactly you try to fit a model to the small sample of training data, that is, the harder you try to "uncover all the patterns" in it, the more you will eventually succeed at exactly that. But in the background you actually start fitting the "noise" in the data, and you learn little that holds true in the general population.
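To make that concrete, here is a minimal toy sketch in Python with NumPy (my own illustration, not from the course): fit a simple and a very flexible polynomial to a dozen noisy points drawn from an assumed true relationship y = 2x, then check both fits against a large "unseen" sample.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_samples(n):
    # hypothetical "true" pattern: y = 2x, observed with noise
    x = rng.uniform(-1, 1, n)
    y = 2 * x + rng.normal(0, 0.3, n)
    return x, y

x_train, y_train = noisy_samples(12)   # the small sample we learn from
x_test, y_test = noisy_samples(1000)   # stand-in for the unseen population

for degree in (1, 9):
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    test_err = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: train error {train_err:.3f}, test error {test_err:.3f}")
```

The degree-9 polynomial chases every wiggle in the twelve training points, so its training error is tiny, yet it typically does worse on the unseen data than the straight line: it has memorized the noise rather than the underlying pattern.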
Come to real life and it is the same. If you try to learn too much from the small sample of encounters you have had (creating mental models like "all politicians must be dishonest", and so on) and then try to generalize, you fail. The best managers I have worked with are very good at consciously guarding themselves and their teams against reading too much into a single event or experience. You must of course learn from your experiences, but learn only what the data allows you to.