There is an endless literature that studies whether the minimum wage has positive or negative employment effects, spurred in particular by the much discussed Card and Krueger result that the relationship is positive. Most subsequent papers argue for the contrary, both empirically and theoretically.
Fabian Slonimczyk and Peter Skott revive the debate with a theoretical contribution using a model where workers have an incentive to shirk on the job. Suppose there are two types of workers, skilled and unskilled, and two types of jobs, good and bad. When monitoring is imperfect and workers cannot pre-commit not to shirk, firms threaten dismissal for bad effort, and must keep wages high to make that threat costly. This induces rationing of jobs, so some skilled workers end up in bad jobs, and some unskilled workers end up unemployed. This mismatch is particularly wasteful because it amounts to over-education. Now introduce or increase a minimum wage. As long as it binds and firms prefer low-skill workers in low-tech jobs, it relaxes the no-shirking condition for low-skill workers, and thus increases their employment and wage, with little consequence for high-skill workers. If, however, firms prefer high-skill workers in low-tech jobs, there is an increase in high-skill employment and a decrease in low-skill employment. The key here is that heterogeneous skills create monopsonies and job-skill mismatches.
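The mechanism can be made concrete with the canonical no-shirking condition from the Shapiro-Stiglitz efficiency-wage model, which this class of models builds on. The notation below is the textbook version, not necessarily Slonimczyk and Skott's exact formulation:

```latex
% No-shirking condition (Shapiro-Stiglitz form; notation is illustrative):
% e = effort cost, q = probability of being caught shirking,
% b = exogenous separation rate, r = discount rate,
% rV_u = flow value of being unemployed.
w \;\ge\; rV_u + e + (r + b)\,\frac{e}{q}
```

Firms must pay a wage above this threshold so that the threat of dismissal deters shirking, and the resulting job rationing (unemployment) is what makes dismissal costly to workers. A binding minimum wage pushes the low-skill wage above the threshold mechanically, so less unemployment is needed as a discipline device, which is the channel through which the minimum wage can raise low-skill employment in this kind of model.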
Slonimczyk and Skott then try to back their results up with some empirics, but the short samples do not give me much confidence in those. It would be interesting to see someone test this theory with better data.