Biologically Inspired Design Concept Generation Using Generative Pre-Trained Transformers

by Qihao Zhu, et al.

Biological systems in nature have evolved over millions of years to adapt to and survive in their environments. Many of the features they have developed can inspire and benefit solutions to technical problems in modern industries. This leads to a specific form of design-by-analogy called bio-inspired design (BID). Although BID has proven beneficial as a design method, the gap between biology and engineering continuously hinders designers from applying it effectively. We therefore explore recent advances in artificial intelligence (AI) for a data-driven approach to bridging this gap. This paper proposes a generative design approach based on a generative pre-trained language model (PLM) to automatically retrieve and map biological analogies and generate BID concepts in the form of natural language. The latest generative pre-trained transformer, namely GPT-3, is used as the base PLM. Three types of design concept generators are identified and fine-tuned from the PLM according to the looseness of the problem space representation. Machine evaluators are also fine-tuned to assess the relevancy of the mapping between domains within the generated BID concepts. The approach is evaluated and then employed in a real-world project on designing lightweight flying cars during its conceptual design phase. The results show that our approach can generate BID concepts with good performance.
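The fine-tuning step described above depends on supervised training pairs that map a design problem to a biologically inspired concept. As a rough illustration only (this is not the authors' dataset or code), GPT-3 fine-tuning at the time accepted training data as JSONL records with `prompt` and `completion` fields; the sketch below builds such a file from two hypothetical problem/concept pairs.

```python
import json

# Hypothetical problem -> BID concept pairs (illustrative only; the paper's
# actual training corpus is not reproduced here).
training_pairs = [
    {
        "problem": "Reduce drag on a high-speed train nose",
        "concept": "Inspired by the kingfisher's beak, shape the nose as a "
                   "slender wedge to cut smoothly through the air.",
    },
    {
        "problem": "Keep building surfaces clean without detergents",
        "concept": "Inspired by the lotus leaf, use micro-textured "
                   "hydrophobic coatings so water droplets carry dirt away.",
    },
]

def to_finetune_jsonl(pairs, path):
    """Write prompt/completion records in the JSONL format that GPT-3
    fine-tuning expected: one JSON object per line."""
    with open(path, "w", encoding="utf-8") as f:
        for pair in pairs:
            record = {
                "prompt": f"Problem: {pair['problem']}\nBID concept:",
                # GPT-3 fine-tuning conventionally used a leading space
                # on completions.
                "completion": " " + pair["concept"],
            }
            f.write(json.dumps(record) + "\n")

to_finetune_jsonl(training_pairs, "bid_finetune.jsonl")
```

A file like this would then be uploaded to the fine-tuning service; the paper's three generator types would presumably differ in how loosely the `prompt` describes the problem space.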


