Paper Title
Dynamically-Scaled Deep Canonical Correlation Analysis
Paper Authors
Paper Abstract
Canonical Correlation Analysis (CCA) is a method for feature extraction from two views by finding maximally correlated linear projections of them. Several variants of CCA have been introduced in the literature, in particular variants based on deep neural networks for learning highly correlated nonlinear transformations of two views. As these models are parameterized conventionally, their learnable parameters remain independent of the inputs after the training process, which may limit their capacity for learning highly correlated representations. We introduce a novel dynamic scaling method for training an input-dependent canonical correlation model. In our deep CCA model, the parameters of the last layer are scaled by a second neural network that is conditioned on the model's input, resulting in a parameterization that depends on the input samples. We evaluate our model on multiple datasets and demonstrate that it learns representations that are more correlated than those of conventionally parameterized CCA-based models, and that it also yields better retrieval results. Our code is available at https://github.com/tomerfr/DynamicallyScaledDeepCCA.
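
To make the dynamic scaling idea concrete, below is a minimal PyTorch sketch of an input-conditioned last layer: a second network maps each input sample to per-output scales that rescale the rows of a shared weight matrix. The class name DynamicallyScaledHead, the sigmoid gating, and the hidden width are illustrative assumptions, not the paper's exact architecture; see the linked repository for the authors' implementation.

import torch
import torch.nn as nn

class DynamicallyScaledHead(nn.Module):
    # Hypothetical sketch: the last layer's weights are rescaled per
    # sample by a second ("scaling") network conditioned on the input,
    # so the effective parameterization depends on the input sample.
    def __init__(self, in_dim, out_dim, hidden=64):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_dim, in_dim) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_dim))
        # Scaling network: one multiplicative scale per output unit.
        self.scaler = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, out_dim), nn.Sigmoid(),
        )

    def forward(self, x):
        s = self.scaler(x)  # (batch, out_dim) per-sample scales
        # s * (W x) equals applying diag(s) W per sample, i.e. the rows
        # of the shared weight matrix are scaled input-dependently.
        return s * (x @ self.weight.t()) + self.bias

# Usage sketch: one such head per view; the two projected views would
# then be trained with a CCA-style correlation objective.
head_x = DynamicallyScaledHead(in_dim=128, out_dim=10)
z_x = head_x(torch.randn(32, 128))  # (32, 10) projections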