As a scene graph compactly summarizes the high-level content of an image in a structured and symbolic manner, the similarity between the scene graphs of two images reflects the relevance of their contents. Based on this idea, we propose a novel approach to image-to-image retrieval that uses scene graph similarity measured by graph neural networks. With the training methods presented in this paper, graph neural networks successfully capture the similarity between scene graphs. When applied to the image retrieval task, they achieve higher agreement with human perception of image similarity than competitive baselines in our human evaluation. As shown in our experiments, the proposed approach performs equally well on machine-generated scene graphs, indicating the broad applicability of our method.