All Posts (39)

- Contrastive Representation Learning: 1. Contrastive Training Objectives
  Reference: https://lilianweng.github.io/posts/2021-05-31-contrastive/#contrastive-loss — "The goal of contrastive representation learning is to learn an embedding space in which similar sample pairs stay close to each other while dissimilar ones are far apart. Contrastive learning can be applied to both supervised and unsupervised settings."
- VQ-VAE: Neural Discrete Representation Learning
- VICReg: Variance-Invariance-Covariance Regularization for Self-Supervised Learning (ICLR 2022) & Distilling Representations from GAN Generator via Squeeze and Span (NeurIPS 2022)
- InfinityGAN: Towards Infinite-Pixel Image Synthesis
- Rosetta Neurons: Mining the Common Units in a Model Zoo (ICCV 2023)
- OakInk: A Large-scale Knowledge Repository for Understanding Hand-Object Interaction
- Reusing the Task-specific Classifier as a Discriminator: Discriminator-free Adversarial Domain Adaptation
- MS Thesis
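The contrastive objective quoted in the preview above (similar pairs pulled together, dissimilar pairs pushed apart) can be sketched with the classic pairwise contrastive loss. This is a minimal illustration, not code from any of the listed posts; the function name and the `margin=1.0` default are assumptions made for the example.

```python
import numpy as np

def contrastive_loss(x1, x2, same_pair, margin=1.0):
    """Classic pairwise contrastive loss: pull embeddings of similar
    pairs together, push dissimilar pairs at least `margin` apart."""
    d = np.linalg.norm(x1 - x2)  # Euclidean distance between the two embeddings
    if same_pair:
        return d ** 2                      # similar pair: penalize any separation
    return max(0.0, margin - d) ** 2       # dissimilar pair: penalize only inside the margin

a = np.array([1.0, 0.0])
b = np.array([1.0, 0.0])
c = np.array([3.0, 0.0])

print(contrastive_loss(a, b, same_pair=True))    # 0.0 — identical similar pair, no loss
print(contrastive_loss(a, b, same_pair=False))   # 1.0 — dissimilar pair at distance 0, penalized by margin**2
print(contrastive_loss(a, c, same_pair=False))   # 0.0 — already farther apart than the margin
```

Training on many such pairs shapes the embedding space exactly as the quoted definition describes: small distances for positives, at-least-margin distances for negatives.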