An attractive theory for the occurrence of spiral patterns in simulated and, hopefully, real disc galaxies is based on the idea that the underlying stellar disc is linearly unstable and spontaneously grows eigenmodes. These rotating, overlapping modes then form the changing, transient patterns observed in simulated discs (Sellwood & Carlberg 2014). This raises the question of why such discs are linearly unstable in the first place. Using the linearized Boltzmann equation, I investigate how grooves carved into the phase space of a stellar disc can trigger the vigorous growth of two-armed spiral eigenmodes (De Rijcke, Fouvry & Pichon 2019). Such grooves result from the self-induced dynamics of a disc subject to finite-N shot noise, as swing-amplified noise patterns push stars towards lower-angular-momentum orbits at their inner Lindblad radius (Sellwood 2012; Fouvry et al. 2015). I provide evidence that the depletion of near-circular orbits, rather than the addition of radial orbits, is the crucial physical ingredient causing these new eigenmodes. Thus, an isolated, linearly stable stellar disc can spontaneously become linearly unstable via the self-induced formation of phase-space grooves through finite-N dynamics. To allow a direct comparison between the linear stability computations and N-body simulations, the former were equipped with gravitational softening (De Rijcke, Fouvry & Dehnen 2019). I also present first results obtained with this linear stability code, now including the gravitational coupling between a stellar disc and a cooling gas disc, which enables the search for eigenmodes in the combined star+gas system.
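For concreteness, the linear stability analysis mentioned above rests on the linearized collisionless Boltzmann equation. In its standard textbook form in angle-action variables $(\boldsymbol{\theta}, \mathbf{J})$ (this is the generic form of the equation, not necessarily the exact formulation used in the cited works), a small perturbation $f_1$ of an axisymmetric equilibrium distribution function $f_0(\mathbf{J})$ evolves as

\begin{equation}
\frac{\partial f_1}{\partial t}
+ \boldsymbol{\Omega}(\mathbf{J}) \cdot \frac{\partial f_1}{\partial \boldsymbol{\theta}}
- \frac{\partial f_0}{\partial \mathbf{J}} \cdot \frac{\partial \Phi_1}{\partial \boldsymbol{\theta}} = 0,
\end{equation}

where $\boldsymbol{\Omega}(\mathbf{J}) = \partial H_0 / \partial \mathbf{J}$ are the orbital frequencies and $\Phi_1$ is the perturbed potential, coupled self-consistently to $f_1$ through the Poisson equation $\nabla^2 \Phi_1 = 4\pi G \int f_1 \, \mathrm{d}^3 v$. The gradient term $\partial f_0 / \partial \mathbf{J}$ makes clear why a phase-space groove matters: carving out near-circular orbits introduces a steep local gradient in $f_0$, which strengthens the response of the disc to perturbations and can push the system across the threshold to linear instability.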