Use plaidML to do Machine Learning on macOS with an AMD GPU

by Alex Wulff, December 16th, 2020

Too Long; Didn't Read

Want to train machine learning models on your Mac’s integrated AMD GPU or an external graphics card? Look no further than PlaidML.
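As a quick illustration of what the article walks through, the usual workflow is to install PlaidML's Keras backend, run plaidml-setup once to pick your AMD GPU (or eGPU), and then point Keras at the PlaidML backend before importing it. The sketch below is a minimal, hypothetical example assuming the plaidml-keras package and a throwaway model; your own model and data will differ.

    # One-time setup in a terminal:
    #   pip install plaidml-keras plaidbench
    #   plaidml-setup   # interactively choose your AMD GPU / eGPU

    import os
    # Tell Keras to use PlaidML instead of TensorFlow *before* importing keras
    os.environ["KERAS_BACKEND"] = "plaidml.keras.backend"

    import numpy as np
    import keras
    from keras.models import Sequential
    from keras.layers import Dense

    # A tiny throwaway model just to confirm training runs on the GPU
    model = Sequential([
        Dense(64, activation="relu", input_shape=(32,)),
        Dense(1),
    ])
    model.compile(optimizer="adam", loss="mse")

    x = np.random.rand(1024, 32).astype("float32")
    y = np.random.rand(1024, 1).astype("float32")
    model.fit(x, y, epochs=2, batch_size=64)

If the setup worked, PlaidML prints the selected device (for example your Radeon GPU) when the backend loads, and training proceeds on it rather than the CPU.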