Too Long; Didn't Read
This blog post is the fourth in the 5-minute Papers series. In this post, I’ll give highlights from the paper <a href="https://arxiv.org/pdf/1903.05987.pdf" target="_blank">“To Tune or Not to Tune? Adapting Pretrained Representations to Diverse Tasks”</a> by Matthew Peters, <a href="https://hackernoon.com/interview-with-deep-learning-and-nlp-researcher-sebastian-ruder-91ddaf473c4b" target="_blank">Sebastian Ruder</a>, and Noah A. Smith.