“To Tune or Not to Tune? Adapting Pretrained Representations to Diverse Tasks”: Paper Discussion

by Sanyam Bhutani, March 15th, 2019

Too Long; Didn't Read

This blog post is the fourth in the 5-minute Papers series. In this post, I’ll give highlights from the paper “To Tune or Not to Tune? Adapting Pretrained Representations to Diverse Tasks” (https://arxiv.org/pdf/1903.05987.pdf) by Matthew Peters, Sebastian Ruder (https://hackernoon.com/interview-with-deep-learning-and-nlp-researcher-sebastian-ruder-91ddaf473c4b), and Noah A. Smith.


About Author

Sanyam Bhutani (@init_27)
👨‍💻 H2Oai 🎙 CTDS.Show & CTDS.News 👨‍🎓 fast.ai 🎲 Kaggle 3x Expert
