Differentiable solvers make it possible to learn physics-informed machine-learning (ML) surrogate models by backpropagating gradient information through multiple integration steps of the underlying partial differential equation. We leverage the automatic differentiation capabilities of our in-house finite-volume JAX-Fluids CFD framework to devise a novel data-driven subgrid-scale (SGS) model for machine-learned implicit large-eddy simulation (ML-ILES) of compressible flows. Our SGS modelling strategy is based on an optimized physics-informed deconvolution algorithm. In contrast to standard smoothness-based reconstruction strategies such as WENO or TENO, we employ neural network-based reconstruction algorithms for each physical quantity separately, so that underresolved turbulent structures are approximated as accurately as possible. In this process, the deconvolution is enriched by latent information from the encoded local flow state. The data-driven model is trained on coarse-grained spatio-temporal trajectories of direct numerical simulations (DNS), see Figure 1. First, we apply the novel SGS model to homogeneous isotropic turbulence and demonstrate that, owing to the chosen ML-ILES ansatz, the SGS model generalizes well to various spatial resolutions, Mach numbers, and Reynolds numbers multiple orders of magnitude higher than those of the training data. Second, we extend the SGS model to wall-bounded flows, in particular turbulent channel flows. The implicit SGS model is able to detect underresolved turbulent structures in the near-wall region. Enriched by the encoded local flow state, the adaptive deconvolution process seamlessly accounts for the local flow anisotropy. We demonstrate that the novel ML-ILES ansatz yields improved results for wall-bounded flows while imposing less stringent near-wall resolution requirements, thereby substantially reducing computational cost.
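The core training mechanism, backpropagating through multiple integration steps of a differentiable solver, can be illustrated with a minimal JAX sketch. This is not the JAX-Fluids implementation: the one-dimensional diffusion step, the scalar learnable correction `params`, and the endpoint loss against a coarse-grained target are all simplified stand-ins for the finite-volume solver, the neural deconvolution model, and the DNS trajectory data described above.

```python
import jax
import jax.numpy as jnp

def step(u, dt=1e-3, dx=0.1, nu=0.01):
    # One explicit finite-difference step of the 1D diffusion equation,
    # a toy stand-in for a full compressible-flow integrator.
    lap = (jnp.roll(u, -1) - 2.0 * u + jnp.roll(u, 1)) / dx**2
    return u + dt * nu * lap

def rollout(params, u0, n_steps=10):
    # In the paper, "params" would parameterize the NN-based deconvolution;
    # here a hypothetical scalar correction keeps the sketch minimal.
    def body(u, _):
        u = step(u) + params * u  # solver step + learned SGS correction
        return u, None
    u_final, _ = jax.lax.scan(body, u0, None, length=n_steps)
    return u_final

def loss(params, u0, target):
    # A-posteriori loss against a coarse-grained reference endpoint.
    return jnp.mean((rollout(params, u0) - target) ** 2)

u0 = jnp.sin(jnp.linspace(0.0, 2.0 * jnp.pi, 32, endpoint=False))
target = 0.9 * u0  # placeholder for coarse-grained DNS data
# Gradient flows through all n_steps solver iterations via autodiff.
grad_params = jax.grad(loss)(0.0, u0, target)
```

Because `jax.grad` differentiates through the entire `lax.scan` rollout, the learned correction is optimized on the solver's actual discrete dynamics rather than on isolated single-step snapshots, which is the key property the ML-ILES ansatz exploits.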