GEN: Highly Efficient SMILES Explorer Using Autodidactic Generative Examination Networks
Recurrent neural networks have been widely used to generate millions of de novo molecules within a known chemical space. These deep generative models are typically set up with LSTM or GRU units and trained on canonical SMILES. In this study, we introduce a new robust architecture, Generative Examination Networks (GEN), based on bidirectional RNNs with concatenated sub-models to learn and generate molecular SMILES within a trained target space. GENs autonomously learn the target space in a few epochs while being subjected to an independent online examination mechanism that measures the quality of the generated set. Here we have used online statistical quality control (SQC) on the percentage of valid molecular SMILES as the examination measure to select the earliest available stable model weights. Very high levels of valid SMILES (95-98%) can be generated using multiple parallel encoding layers in combination with SMILES augmentation by unrestricted SMILES randomization. Our architecture combines an excellent novelty rate (85-90%) with high conservation of the property space (95-99%), and the examination mechanism is open to other quality criteria.
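As a rough illustration of the ideas summarized above, the sketch below pairs a bidirectional recurrent SMILES generator with concatenated sub-models, a SMILES randomization helper for augmentation, and an examination callback that scores the percentage of valid sampled SMILES with RDKit and keeps the earliest stable weights. It assumes a Keras/RDKit environment; the layer sizes, vocabulary constants, the 95% target, and names such as build_generator, randomized_smiles, percent_valid, and the sampler hook are illustrative assumptions, not details taken from the paper.

```python
# Minimal GEN-style sketch (illustrative only; hyperparameters, helper names
# and the sampling hook are assumptions, not the authors' implementation).
from rdkit import Chem
from tensorflow import keras
from tensorflow.keras import layers

VOCAB_SIZE = 64   # assumed token vocabulary size
MAX_LEN = 100     # assumed maximum SMILES length


def build_generator():
    """Character-level SMILES model with two parallel bidirectional
    recurrent sub-models whose outputs are concatenated."""
    inp = layers.Input(shape=(MAX_LEN,), dtype="int32")
    emb = layers.Embedding(VOCAB_SIZE, 128)(inp)
    # Parallel encoding layers, merged by concatenation.
    a = layers.Bidirectional(layers.LSTM(256, return_sequences=True))(emb)
    b = layers.Bidirectional(layers.GRU(256, return_sequences=True))(emb)
    merged = layers.Concatenate()([a, b])
    out = layers.TimeDistributed(
        layers.Dense(VOCAB_SIZE, activation="softmax"))(merged)
    model = keras.Model(inp, out)
    model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
    return model


def randomized_smiles(smiles):
    """SMILES augmentation by unrestricted randomization: a random,
    non-canonical atom ordering of the same molecule."""
    mol = Chem.MolFromSmiles(smiles)
    return Chem.MolToSmiles(mol, canonical=False, doRandom=True) if mol else None


def percent_valid(smiles_list):
    """Examination measure: percentage of sampled SMILES that RDKit parses."""
    valid = sum(Chem.MolFromSmiles(s) is not None for s in smiles_list)
    return 100.0 * valid / max(len(smiles_list), 1)


class SQCExamination(keras.callbacks.Callback):
    """After each epoch, sample SMILES and keep the earliest weights whose
    validity percentage reaches the target level (a simple stand-in for the
    paper's online statistical quality control)."""

    def __init__(self, sampler, target=95.0, n_samples=1000):
        super().__init__()
        self.sampler = sampler        # function: (model, n) -> list of SMILES
        self.target = target
        self.n_samples = n_samples
        self.best_weights = None

    def on_epoch_end(self, epoch, logs=None):
        validity = percent_valid(self.sampler(self.model, self.n_samples))
        if validity >= self.target and self.best_weights is None:
            self.best_weights = self.model.get_weights()
            self.model.stop_training = True
```

In this reading, training the generator on randomized (rather than canonical) SMILES supplies the augmentation, while the callback acts as the independent online examiner that decides when the model weights are good enough to stop.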