Controlled Text Generation using T5 based Encoder-Decoder Soft Prompt Tuning and Analysis of the Utility of Generated Text in AI

Controlled text generation is an important task in natural language processing due to its promising applications. To address it, we introduce a novel soft prompt tuning method that uses soft prompts at both the encoder and decoder levels of a T5 model, and we investigate its performance, since the behaviour of an additional soft prompt attached to the decoder of a T5 model in controlled text generation has remained unexplored. We then investigate the feasibility of steering the output of this extended soft-prompted T5 model at the decoder level. Finally, given the lack of proper analysis of methodologies for generating correctly labelled data for AI tasks, we analyse the utility of the generated text for AI-related tasks such as training AI models, including an interpretability analysis of a classifier trained with the synthetic text. Through in-depth intrinsic and extrinsic evaluations of this generation model and the artificially generated data, we find that it outperforms a T5 model with a single soft prompt at the encoder level, that a sentiment classifier trained on the artificially generated data achieves classification results comparable to those of a classifier trained on real labelled data, and that the classifier's decisions are interpretable with respect to the input text content.
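The core mechanism described above, prepending trainable soft-prompt vectors to the embedded input of both the encoder and the decoder, can be illustrated with a minimal sketch. This is not the paper's implementation: the dimensions, the frozen embedding table, and the function names are all illustrative assumptions, and a real setup would train the prompts by backpropagation through a frozen T5.

```python
import numpy as np

# Illustrative sizes, not taken from the paper.
d_model = 8     # embedding dimension
n_prompt = 3    # soft prompt length
vocab = 100

rng = np.random.default_rng(0)
embed = rng.normal(size=(vocab, d_model))           # frozen token embedding table
enc_prompt = rng.normal(size=(n_prompt, d_model))   # trainable encoder soft prompt
dec_prompt = rng.normal(size=(n_prompt, d_model))   # trainable decoder soft prompt

def with_soft_prompt(token_ids, prompt):
    """Prepend the soft-prompt vectors to the embedded token sequence."""
    tok_emb = embed[token_ids]                      # (seq_len, d_model)
    return np.concatenate([prompt, tok_emb], axis=0)

# Encoder input gets its own prompt; the decoder input gets a second,
# independent prompt -- the addition this work studies.
enc_in = with_soft_prompt([5, 17, 42], enc_prompt)  # shape (3 + 3, 8)
dec_in = with_soft_prompt([1, 8], dec_prompt)       # shape (3 + 2, 8)

print(enc_in.shape, dec_in.shape)
```

In practice only `enc_prompt` and `dec_prompt` would receive gradient updates while the T5 weights stay frozen, which is what makes soft prompt tuning parameter-efficient.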


Text Generation with Exemplar-based Adaptive Decoding

We propose a novel conditioned text generation model. It draws inspirati...

Select and Attend: Towards Controllable Content Selection in Text Generation

Many text generation tasks naturally contain two steps: content selectio...

Discourse Embellishment Using a Deep Encoder-Decoder Network

We suggest a new NLG task in the context of the discourse generation pip...

A Semi-Supervised Approach for Low-Resourced Text Generation

Recently, encoder-decoder neural models have achieved great success on t...

Controlled Text Generation for Data Augmentation in Intelligent Artificial Agents

Data availability is a bottleneck during early stages of development of ...

Bootstrapping Generators from Noisy Data

A core step in statistical data-to-text generation concerns learning cor...

A Hierarchical Model for Data-to-Text Generation

Transcribing structured data into natural language descriptions has emer...
