Thanks to the rapid increase in computational capability in recent years, traditional and more explainable methods have gradually been replaced by more complex deep-learning-based approaches, which have achieved new state-of-the-art results on a variety of tasks. However, for certain kinds of applications, performance alone is not enough. A prime example is the medical field, in which building trust between physicians and AI models is fundamental. Providing an explainable or trustworthy model, however, is not a trivial task, given the black-box nature of deep-learning-based methods. While some existing techniques, such as gradient or saliency maps, attempt to provide insight into the inner workings of deep neural networks, they often convey limited information with respect to clinical needs. We propose a two-step diagnostic approach for the detection of Covid-19 infection from Chest X-Ray (CXR) images. Our approach is designed to mimic the diagnostic process of human radiologists: it first detects objective radiological findings in the lungs, which are then used to make the final Covid-19 diagnosis. We believe that this kind of structural explainability is preferable in this context. The proposed approach achieves promising performance in Covid-19 detection, comparable to that of expert human radiologists. Moreover, although this work focuses on Covid-19, we believe the same approach could be applied to many other CXR-based diagnoses.
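As a rough illustration of the two-step design (not the paper's actual architecture), the pipeline can be sketched as a findings detector whose output feeds a separate diagnosis head. All module names, the backbone, and the list of findings below are hypothetical placeholders:

```python
import torch
import torch.nn as nn

# Hypothetical radiological findings predicted by the first stage (multi-label).
FINDINGS = ["consolidation", "ground_glass_opacity", "pleural_effusion", "interstitial_pattern"]

class FindingsDetector(nn.Module):
    """Stage 1: maps a CXR image to independent probabilities of radiological findings."""
    def __init__(self, num_findings: int):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.head = nn.Linear(32, num_findings)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Sigmoid per finding: findings are not mutually exclusive.
        return torch.sigmoid(self.head(self.backbone(x)))

class CovidClassifier(nn.Module):
    """Stage 2: maps the findings vector to a Covid-19 probability.
    Its input is interpretable, so the decision can be traced back to findings."""
    def __init__(self, num_findings: int):
        super().__init__()
        self.linear = nn.Linear(num_findings, 1)

    def forward(self, findings: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.linear(findings))

# Usage: a 1-channel 224x224 CXR passes through both stages.
detector = FindingsDetector(len(FINDINGS))
classifier = CovidClassifier(len(FINDINGS))
cxr = torch.randn(1, 1, 224, 224)   # placeholder image batch
findings = detector(cxr)            # per-finding probabilities (the "explanation")
p_covid = classifier(findings)      # final diagnosis computed from the findings alone
```

The key design choice this sketch captures is that the second stage sees only the findings vector, so any diagnosis can be justified in terms a radiologist already uses.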