
AI-Driven Reverse Engineering of Biomimetic Structures via GNN-GAN Synergy

DOI: 10.23977/jaip.2025.080405

Author(s)

Baixin Pan 1

Affiliation(s)

1 The University of Hong Kong, Hong Kong, China

Corresponding Author

Baixin Pan

ABSTRACT

This article explores a hybrid model that combines a Graph Neural Network (GNN) with a Generative Adversarial Network (GAN) to address the challenge of generating novel biomimetic graphs with desired properties. The central hypothesis is that this synergistic framework can learn both the structural grammar of biomimetic systems and the mapping between structure and function. We demonstrate how a GNN-based property loss can guide the generator during training, discuss architectural design choices, and outline the integration of a GNN-based property predictor into a conditional GAN framework. In addition, we propose a comprehensive multi-metric evaluation framework, present strategies for mitigating training instability and mode collapse, and examine effective graph-based representations of biomimetic structures. This research aims to move beyond traditional forward design and enable efficient inverse design for applications in materials science, drug discovery, and tissue engineering.
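
To make the property-guided objective described in the abstract concrete, the following is a minimal sketch in PyTorch, illustrative only and not the paper's implementation. The class names PropertyGNN, GraphGenerator and GraphCritic, the dense-adjacency graph representation, and the weighting coefficient lambda_prop are assumptions introduced here for illustration; the core idea shown is a generator trained against a Wasserstein-style critic while a frozen GNN property predictor supplies an additional loss that pulls generated graphs toward a target property value.

# Illustrative sketch only (assumptions: dense adjacency matrices, MolGAN-style
# soft graphs; all class names and lambda_prop are hypothetical, not the paper's).
import torch
import torch.nn as nn

class PropertyGNN(nn.Module):
    # Simple message-passing predictor: two rounds of neighbourhood aggregation
    # over a dense adjacency matrix, followed by a mean readout.
    def __init__(self, n_feat, hidden=64):
        super().__init__()
        self.lin1 = nn.Linear(n_feat, hidden)
        self.lin2 = nn.Linear(hidden, hidden)
        self.readout = nn.Linear(hidden, 1)

    def forward(self, X, A):
        H = torch.relu(self.lin1(torch.bmm(A, X)))   # aggregate neighbour features
        H = torch.relu(self.lin2(torch.bmm(A, H)))
        return self.readout(H.mean(dim=1))           # graph-level property estimate

class GraphCritic(nn.Module):
    # Wasserstein critic scoring (node features, adjacency) pairs.
    def __init__(self, n_feat, hidden=64):
        super().__init__()
        self.lin1 = nn.Linear(n_feat, hidden)
        self.readout = nn.Linear(hidden, 1)

    def forward(self, X, A):
        H = torch.relu(self.lin1(torch.bmm(A, X)))
        return self.readout(H.mean(dim=1))

class GraphGenerator(nn.Module):
    # Maps a latent vector to soft node-feature and adjacency matrices.
    def __init__(self, z_dim, n_nodes, n_feat, hidden=128):
        super().__init__()
        self.n_nodes, self.n_feat = n_nodes, n_feat
        self.mlp = nn.Sequential(
            nn.Linear(z_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, n_nodes * n_feat + n_nodes * n_nodes))

    def forward(self, z):
        out = self.mlp(z)
        X = out[:, : self.n_nodes * self.n_feat].view(-1, self.n_nodes, self.n_feat)
        A = out[:, self.n_nodes * self.n_feat:].view(-1, self.n_nodes, self.n_nodes)
        A = torch.sigmoid(0.5 * (A + A.transpose(1, 2)))   # soft, symmetric adjacency
        return torch.softmax(X, dim=-1), A

def generator_loss(critic, predictor, X_fake, A_fake, target_prop, lambda_prop=1.0):
    # Adversarial term (fool the critic) plus a GNN-based property loss that
    # guides generated graphs toward the desired property value.
    adv = -critic(X_fake, A_fake).mean()
    prop = ((predictor(X_fake, A_fake) - target_prop) ** 2).mean()
    return adv + lambda_prop * prop

# Example usage with random data (batch of 8 graphs, 12 nodes, 5 node types):
if __name__ == "__main__":
    gen = GraphGenerator(z_dim=32, n_nodes=12, n_feat=5)
    critic = GraphCritic(n_feat=5)
    predictor = PropertyGNN(n_feat=5)          # in practice: pretrained, then frozen
    for p in predictor.parameters():
        p.requires_grad_(False)
    z = torch.randn(8, 32)
    X_fake, A_fake = gen(z)
    loss = generator_loss(critic, predictor, X_fake, A_fake, target_prop=0.8)
    loss.backward()                            # gradients flow into the generator

In a full training loop the critic would also be updated, typically with a gradient penalty or spectral normalization as the Wasserstein GAN keyword suggests, which is one of the strategies for mitigating training instability and mode collapse noted in the abstract.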

KEYWORDS

GNN-GAN; Biomimetic Structures; Inverse Design; Molecular Generation; Graph Neural Networks; Generative Adversarial Networks; Wasserstein GAN

CITE THIS PAPER

Baixin Pan, AI-Driven Reverse Engineering of Biomimetic Structures via GNN-GAN Synergy. Journal of Artificial Intelligence Practice (2025) Vol. 8: 32-48. DOI: http://dx.doi.org/10.23977/jaip.2025.080405.

REFERENCES

[1] Hoogeboom, E.; Satorras, V. G.; Vignac, C.; Welling, M., 2022. Equivariant diffusion for molecule generation in 3D. In Proceedings of the International Conference on Machine Learning (ICML), Baltimore, MD, USA, 17–23 July 2022; pp. 8867–8887.
[2] Merchant, A.; Batzner, S.; Schoenholz, S. S.; Aykol, M.; Montoya, J. H.; Cubuk, E. D., 2023. Scaling deep learning for materials discovery. Nature, 624, 80–85.
[3] Jin, W.; Barzilay, R.; Jaakkola, T., 2018. Junction tree variational autoencoder for molecular graph generation. In Proceedings of the International Conference on Machine Learning (ICML), Stockholm, Sweden, 10–15 July 2018; pp. 2323–2332.
[4] Fink, T.; Reymond, J.-L., 2007. Virtual exploration of the chemical universe up to 11 atoms of C, N, O, F. J. Chem. Inf. Model., 47, 342–353.
[5] De Cao, N.; Kipf, T., 2018. MolGAN: An implicit generative model for small molecular graphs. arXiv Prepr., arXiv:1805.11973.
[6] Zeni, C.; Bietti, A.; Burns, K.; Hu, N.; Ligett, K.; Swersky, K., 2024. MatterGen: A generative model for inorganic materials design. arXiv Prepr., arXiv:2312.03687.
[7] Wieder, O.; Kohlbacher, S.; Kuenemann, M.; Garon, A.; Ducrot, P.; Seidel, T.; Langer, T., 2020. A compact review of molecular property prediction with graph neural networks. Drug Discov. Today Technol., 37, 1–12.
[8] Li, Y.; Zhang, L.; Liu, Z., 2018. Multi-objective de novo drug design with conditional graph generative model. J. Cheminform., 10, 33.
[9] Zhou, J.; Cui, G.; Hu, S.; Zhang, Z.; Yang, C.; Liu, Z.; Sun, M., 2020. Graph neural networks: A review of methods and applications. AI Open, 1, 57–81.
[10] Court, C. J.; Cole, J. M., 2020. Auto-generated materials database: Linking microstructure to properties with graph neural networks. npj Comput. Mater., 6, 1–11.
[11] Yan, C.; Zhao, S.; Wang, Y., 2020. Motif-based graph neural networks for molecular property prediction. arXiv Prepr., arXiv:2010.04713.
[12] Karamad, M.; Magar, R.; Shi, Y.; Siahrostami, S.; Gates, I. D.; Barati Farimani, A., 2020. Orbital graph convolutional neural network for material property prediction. Phys. Rev. Mater., 4, 093801.
[13] Kipf, T. N.; Welling, M., 2017. Semi-supervised classification with graph convolutional networks. In Proceedings of the International Conference on Learning Representations (ICLR), Toulon, France, 24–26 April 2017.
[14] Veličković, P.; Cucurull, G.; Casanova, A.; Romero, A.; Liò, P.; Bengio, Y., 2018. Graph attention networks. In Proceedings of the International Conference on Learning Representations (ICLR), Vancouver, Canada, 30 April–3 May 2018.
[15] Hamilton, W. L.; Ying, R.; Leskovec, J., 2017. Inductive representation learning on large graphs. In Advances in Neural Information Processing Systems (NeurIPS), Long Beach, CA, USA, 4–9 December 2017; pp. 1024–1034.
[16] Han, J.; Rong, Y.; Xu, T.; Huang, W., 2022. Multi-view graph neural networks for molecular property prediction. arXiv Prepr., arXiv:2205.13671.
[17] Goodfellow, I.; Pouget-Abadie, J.; Mirza, M.; Xu, B.; Warde-Farley, D.; Ozair, S.; Courville, A.; Bengio, Y., 2014. Generative adversarial nets. In Advances in Neural Information Processing Systems (NeurIPS), Montreal, Canada, 8–13 December 2014; pp. 2672–2680.
[18] Mirza, M.; Osindero, S., 2014. Conditional generative adversarial nets. arXiv Prepr., arXiv:1411.1784.
[19] Saxena, D.; Cao, J.; Snoek, J., 2021. On the challenges of generative modeling for molecule generation. arXiv Prepr., arXiv:2102.13557.
[20] Saxena, D.; Cao, J., 2021. Generative modeling of molecular graphs: Challenges and opportunities. Chem. Sci., 12, 11669–11681.
[21] Arjovsky, M.; Bottou, L., 2017. Towards principled methods for training generative adversarial networks. In Proceedings of the International Conference on Learning Representations (ICLR), Toulon, France, 24–26 April 2017.
[22] Jin, W.; Barzilay, R.; Jaakkola, T., 2020. Conditional generation of molecules from disentangled representations. In Proceedings of the International Conference on Machine Learning (ICML), Vienna, Austria, 10–15 July 2020; pp. 8867–8887.
[23] Arjovsky, M.; Chintala, S.; Bottou, L., 2017. Wasserstein generative adversarial networks. In Proceedings of the International Conference on Machine Learning (ICML), Sydney, Australia, 6–11 August 2017; pp. 214–223.
[24] Gulrajani, I.; Ahmed, F.; Arjovsky, M.; Dumoulin, V.; Courville, A., 2017. Improved training of Wasserstein GANs. In Advances in Neural Information Processing Systems (NeurIPS), Long Beach, CA, USA, 4–9 December 2017; pp. 5767–5777.
[25] Miyato, T.; Kataoka, T.; Koyama, M.; Yoshida, Y., 2018. Spectral normalization for generative adversarial networks. In Proceedings of the International Conference on Learning Representations (ICLR), Vancouver, Canada, 30 April–3 May 2018.
[26] Wei, X.; Gong, B.; Liu, Z.; Lu, W.; Wang, L., 2018. Improving the improved training of Wasserstein GANs: A consistency term and its dual effect. In Proceedings of the International Conference on Learning Representations (ICLR), Vancouver, Canada, 30 April–3 May 2018.
[27] Guo, X.; Zhao, L., 2020. A systematic survey on deep generative models for graph generation. arXiv Prepr., arXiv:2007.13673.
[28] Thanh-Tung, H.; Tran, T., 2020. Catastrophic forgetting and mode collapse in GANs. In Proceedings of the International Joint Conference on Neural Networks (IJCNN), Glasgow, UK, 19–24 July 2020; pp. 1–8.
[29] Gretton, A.; Borgwardt, K. M.; Rasch, M. J.; Schölkopf, B.; Smola, A., 2012. A kernel two-sample test. J. Mach. Learn. Res., 13, 723–773.
[30] Xu, K.; Hu, W.; Leskovec, J.; Jegelka, S., 2019. How powerful are graph neural networks? In Proceedings of the International Conference on Learning Representations (ICLR), New Orleans, LA, USA, 6–9 May 2019.
[31] You, J.; Liu, B.; Ying, R.; Pande, V.; Leskovec, J., 2018. Graph convolutional policy network for goal-directed molecular graph generation. In Advances in Neural Information Processing Systems (NeurIPS), Montreal, Canada, 3–8 December 2018; pp. 6410–6421.
[32] Preuer, K.; Renz, P.; Unterthiner, T.; Hochreiter, S.; Klambauer, G., 2018. Fréchet ChemNet Distance: A metric for generative models for molecules. arXiv Prepr., arXiv:1802.09544.
[33] Vignac, C.; Krawczuk, I.; Siraudin, A.; Wang, B.; Cevher, V.; Frossard, P., 2023. DiGress: Discrete denoising diffusion for graph generation. In Proceedings of the International Conference on Learning Representations (ICLR), Kigali, Rwanda, 1–5 May 2023.
[34] Martinkus, K.; Roth, P.; Jaggi, M., 2023. TIGGER: Scalable generative modelling for temporal interaction graphs. arXiv Prepr., arXiv:2307.01364.
[35] Gutteridge, B.; Dong, X.; Bronstein, M.; Di Battista, G., 2024. G²PM: A graph pattern machine for large-scale graph generation. arXiv Prepr., arXiv:2402.14966.
[36] Edwards, C.; Lai, T.; Oei, K.; Zhuo, H. H.; Zhang, Y.; Alon, U., 2024. Text-to-graph generation: Methods and challenges. arXiv Prepr., arXiv:2408.00957.
