DTT: An Example-Driven Tabular Transformer by Leveraging Large Language Models

03/12/2023
by Arash Dargahi Nobari, et al.

Many organizations rely on data from government and third-party sources, and these sources rarely follow the same data formatting. This introduces challenges in integrating data from multiple sources. Commercial database systems do not offer adequate support for integrating data from heterogeneous sources, and manual integration is both time-consuming and inefficient. State-of-the-art approaches rely on similarity functions and textual transformations, but they often fail in challenging cases where multiple mappings are required or the mappings go beyond simple textual transformations. In this paper, we study the potential of deep neural models for transforming tables for joinability. In particular, we cast the problem as a prediction task and develop a framework that leverages large deep-learning language models to transform tabular data from a source formatting to a desired target representation. Our framework efficiently learns the pattern that maps the source formatting to the expected target from just a few examples; the learned mapping can then be used for table joining, filling in missing values, and error detection. Compared to state-of-the-art mapping and joining approaches, our framework delivers noticeably more accurate and scalable performance on both real-world and synthetic datasets. Our experimental evaluation also shows that, despite the significant difference in size, the performance of the proposed framework with our fine-tuned model is on par with or better than that of large language models such as GPT-3, and that integrating large language models into our framework improves their performance.
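To make the example-driven setup concrete, the sketch below shows one way such a transformation could be framed as sequence-to-sequence prediction: a few (source, target) example pairs are serialized into the model input alongside the query value, and the model generates the value in the target format. The prompt serialization, the `t5-small` checkpoint, and the `transform` helper are illustrative assumptions, not the paper's actual design; in practice a checkpoint fine-tuned for this task, as in the proposed framework, would replace the off-the-shelf model.

```python
# A minimal sketch of example-driven cell transformation, assuming a
# seq2seq language model from the Hugging Face `transformers` library.
# The serialization format and model choice are placeholders, not DTT's
# actual input encoding or fine-tuned model.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")

def transform(examples, source_value):
    """Predict the target-format value for `source_value`, given a few
    (source, target) pairs that demonstrate the desired mapping."""
    # Serialize the example pairs and the query into one input sequence.
    prompt = " ; ".join(f"{src} -> {tgt}" for src, tgt in examples)
    prompt += f" ; {source_value} -> "
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=32)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

# Usage: learn a "First Last" -> "last, first" mapping from two examples,
# then apply it to an unseen value from the source table.
examples = [("John Smith", "smith, john"), ("Ada Lovelace", "lovelace, ada")]
print(transform(examples, "Grace Hopper"))  # expected: "hopper, grace"
```

Once such a mapper is learned, every cell of the source join column can be transformed into the target representation, after which an ordinary equi-join (or a fill-in / consistency check for missing values and error detection) becomes possible.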
