Quo Vadis?
  • Jul 7, 2021 deep-learning  paper-review 

    Cross Modal Focal Loss for RGBD Face Anti-Spoofing

    This is a summary and review of the recent paper on face anti-spoofing by George and Marcel 2021, presented at CVPR 2021.

  • Jul 6, 2021 deep-learning  paper-review 

    Data-Efficient Image Transformers

    This is the next post in the series on the ImageNet leaderboard and it takes us to place #71 – Training data-efficient image transformers & distillation through attention. The visual transformers paper showed that it is possible for transformers to surpass CNNs on visual tasks, but doing so takes hundreds of millions of images and hundreds if not thousands of compute-days on TPUs. This paper shows how to train transformers using only ImageNet data on GPUs in a few days.

  • Jul 6, 2021 deep-learning 

    SGD, Adam and Weight Decay

    Optimization lies at the core of deep learning. And yet, how much time do we spend thinking about which optimizer to use when training the latest neural network? This blog post arose from me seeing the AdamW optimizer used in the paper on data-efficient image transformers and then looking for answers to the question: what is AdamW?

  • Jul 5, 2021 deep-learning  imagenet  paper-review 

    A Journey Along the ImageNet Leaderboard: Transformers

    This post is the first of a (planned) series that looks at the papers from the ImageNet leaderboard. As the top places are occupied by Transformers and EfficientNets, we will start our exploration with the 17th place – An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale – the first paper on visual transformers.

  • Jul 3, 2021 deep-learning 

    TensorFlow Model Surgery

    There are things that are very easy to achieve in TensorFlow 2 and things that are annoyingly hard, and the dividing line between them is often surprisingly narrow. For example, it is easy to combine two models into a bigger model, but splitting a model into two parts is difficult. Here we explore some ways of achieving the latter.

Page 2 of 8
2021 © by Martins Bruveris. All Rights Reserved. Built with Jekyll. | Tags | Contact