Deep Learning, Neural Networks, Machine Learning
- When following the slope obtained by differentiation, as gradient descent does, the optimization often gets stuck in a local minimum.
- Solution @Udacity_Intro_to_Deep_Learning_with_PyTorch:
- Random restart: simply restart the optimization from several different starting points and keep the best result.
- Give the movement inertia (momentum): use the record of updates from the last few steps so the optimization keeps moving like a rolling ball and can roll out of shallow local minima.
- Increase the learning rate so a single step is large enough to leap over the local minimum.
- And so on (a rough sketch combining the first two ideas follows below).
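
A minimal sketch of the random-restart and momentum ideas, not taken from the course: plain gradient descent with momentum on a toy 1-D function that has several local minima, restarted from a few random starting points. The objective, learning rate, momentum value, and number of restarts are all illustrative assumptions.

```python
import math
import random

def f(x):
    # Toy objective: a wavy parabola with several local minima (assumed example).
    return x**2 + 10 * math.sin(x)

def grad_f(x):
    # Analytical derivative of f.
    return 2 * x + 10 * math.cos(x)

def descend(x0, lr=0.01, momentum=0.9, steps=500):
    # Gradient descent with momentum: the velocity term carries inertia
    # from previous updates, like a rolling ball.
    x, velocity = x0, 0.0
    for _ in range(steps):
        velocity = momentum * velocity - lr * grad_f(x)
        x += velocity
    return x

# Random restart: run the descent from several random starting points
# and keep the point with the lowest objective value.
best = min((descend(random.uniform(-10, 10)) for _ in range(10)), key=f)
print(best, f(best))
```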
Design for Whom?
- The same discussion came up for two types of innovation.
- Incremental changes, in other words step-by-step improvements.
- Radical changes that completely transform everything.
- It might be possible to apply engineering methods to the knowledge of #problem Identification, Resolution, and Planning.
- Very interesting, I like this kind of application.
- #generalization