Recent advances in machine learning have driven interest in "nonlinear random matrices," which involve the entry-wise application of a deterministic nonlinear function to a random matrix. A notable example is the class of "conjugate kernel matrices" of the form YY*, where Y = f(WX). Here, W and X are random rectangular matrices with i.i.d. centered entries, representing the weights and data of a two-layer feed-forward neural network, and f is a nonlinear activation function applied entry-wise. This talk explores the asymptotic spectral behavior of such matrices under two distinct scenarios: (i) when both W and X have light-tailed entries, and (ii) when W has heavy-tailed entries while X remains light-tailed. We will analyze the terms contributing to the moments of the limiting spectral measure in each case, focusing on the interplay between nonlinearity and tail behavior.
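
For concreteness, the following is a minimal numerical sketch of the setup, not taken from the talk itself: all dimensions, normalizations, and the choice of activation are illustrative assumptions. It samples W and X with i.i.d. centered entries, forms the conjugate kernel matrix YY*, and computes its empirical spectrum, which approximates the limiting spectral measure in the proportional regime.

```python
import numpy as np

# Illustrative dimensions: W is n x d (weights), X is d x m (data),
# with n, d, m growing proportionally in the asymptotic regime.
n, d, m = 1000, 1500, 2000
rng = np.random.default_rng(0)

# Scenario (i): light-tailed case with i.i.d. centered Gaussian entries.
# W is scaled by 1/sqrt(d) so the entries of WX have variance of order one.
W = rng.standard_normal((n, d)) / np.sqrt(d)
X = rng.standard_normal((d, m))

# Scenario (ii) variant (assumption: one possible heavy-tailed choice):
# replace W with, e.g., Student-t entries of low degrees of freedom,
#   W = rng.standard_t(df=2, size=(n, d)) / np.sqrt(d)

f = np.tanh  # illustrative nonlinear activation, applied entry-wise

Y = f(W @ X)
CK = (Y @ Y.T) / m  # conjugate kernel matrix YY*, normalized by m

# Eigenvalues of the empirical conjugate kernel; their histogram
# approximates the limiting spectral measure discussed in the talk.
eigvals = np.linalg.eigvalsh(CK)
```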