All posts (42)

EfficientML.ai Lecture 2: Pruning and Sparsity, Part I

EfficientML.ai Lecture 1: Basics of Neural Networks

YOLOv10
1. Abstract: We aim to further advance the performance-efficiency boundary of YOLOs from both the post-processing and the model-architecture sides. We first tackle the problem of redundant predictions in post-processing by presenting a consistent dual-assignments strategy for NMS-free YOLOs, with dual label assignments (a one-to-many head and a one-to-one head) and a consistent matching metric. It al..

Contrastive Representation Learning: 1. Contrastive Training Objectives
Reference: https://lilianweng.github.io/posts/2021-05-31-contrastive/#contrastive-loss
The goal of contrastive representation learning is to learn an embedding space in which similar sample pairs stay close to each other while dissimilar ones are far apart. Contrastive learning can be applied to both supervised and unsupervised settings..

VQ-VAE: Neural Discrete Representation Learning

VICReg: Variance-Invariance-Covariance Regularization for Self-Supervised Learning (ICLR 2022) & Distilling Representations from GAN Generator via Squeeze and Span (NeurIPS 2022)

InfinityGAN: Towards Infinite-Pixel Image Synthesis

Rosetta Neurons: Mining the Common Units in a Model Zoo (ICCV 2023)
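The contrastive-learning entry above describes an embedding space where similar pairs stay close and dissimilar pairs are pushed apart. A minimal NumPy sketch of the classic pairwise contrastive loss illustrates this; the function name, signature, and margin value are illustrative assumptions, not taken from the linked posts:

```python
import numpy as np

def contrastive_loss(z1, z2, same_pair, margin=1.0):
    """Pairwise contrastive loss (illustrative sketch).

    z1, z2: (N, D) arrays of embedding pairs.
    same_pair: (N,) array, 1.0 for similar pairs, 0.0 for dissimilar.
    Similar pairs are pulled together (squared distance);
    dissimilar pairs are pushed at least `margin` apart (hinge).
    """
    d = np.linalg.norm(z1 - z2, axis=1)            # Euclidean distance per pair
    pos = same_pair * d ** 2                       # penalty for similar pairs far apart
    neg = (1.0 - same_pair) * np.maximum(margin - d, 0.0) ** 2  # hinge for close dissimilar pairs
    return float(np.mean(pos + neg))

# An identical similar pair incurs zero loss; a dissimilar pair
# beyond the margin also incurs zero loss.
```

This is the supervised, pair-based form of the objective; the InfoNCE-style losses discussed in the same post generalize it to one positive against many negatives.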