The Annotated Transformer Revisited

In this article we take an illustrated, annotated look at the Transformer, published in “Attention Is All You Need” (2017) by Vaswani, Shazeer, Parmar, et al. The Transformer architecture was groundbreaking: it achieved 28.4 BLEU on the WMT 2014 English-to-German translation task with comparatively little training. Even though it has since been eclipsed by “Reformer: The Efficient Transformer”, published by Nikita Kitaev, Łukasz Kaiser, and Anselm Levskaya earlier this year (2020), it is still interesting to look at the fundamental idea behind this comparatively “simple network architecture […] based solely on attention mechanisms”. ...

February 22, 2020 · 980 words

Getting started with offline voice recognition

With voice assistants becoming ubiquitous, we are getting used to things listening to us. In this article we explore how to turn a headless Raspberry Pi Zero W into our offline voice assistant. ...

January 26, 2020 · 1193 words

Getting started with Coral Edge TPUs

After reading up on some nice stats about Google’s Coral Edge TPUs published by Sam Sterckval at @RacoonsGroup, I got curious. Let’s see how this works. ...

January 25, 2020 · 888 words