Can Foundation Models Wrangle Your Data?

05/20/2022
by   Avanika Narayan, et al.

Foundation Models (FMs) are models trained on large corpora of data that, at very large scale, can generalize to new tasks without any task-specific finetuning. As these models continue to grow in size, innovations continue to push the boundaries of what they can do on language and image tasks. This paper aims to understand an underexplored area of FMs: classical data tasks like cleaning and integration. As a proof-of-concept, we cast three data cleaning and integration tasks as prompting tasks and evaluate the performance of FMs on them. We find that large FMs generalize and achieve SoTA performance on data cleaning and integration tasks, even though they are not trained for these data tasks. We identify specific research challenges and opportunities that these models present, including challenges with private and temporal data, and opportunities to make data-driven systems more accessible to non-experts. We make our code and experiments publicly available at: https://github.com/HazyResearch/fm_data_tasks.
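To illustrate what "casting a data task as a prompting task" can look like, the sketch below serializes two product records into a natural-language yes/no question, in the spirit of an entity-matching prompt. The serialization format and helper names are illustrative assumptions, not the paper's exact templates:

```python
def serialize(record):
    """Flatten a record into 'attribute: value' text for inclusion in a prompt."""
    return " ".join(f"{key}: {value}" for key, value in record.items())

def entity_match_prompt(record_a, record_b):
    """Cast entity matching as a yes/no question an FM can answer via completion.

    Illustrative template only; the actual prompt format is an assumption here.
    """
    return (
        f"Product A is {serialize(record_a)}. "
        f"Product B is {serialize(record_b)}. "
        "Are Product A and Product B the same?"
    )

prompt = entity_match_prompt(
    {"title": "iPhone 12 64GB", "price": "699"},
    {"title": "Apple iPhone 12 (64 GB)", "price": "699.00"},
)
print(prompt)
```

The FM's free-text completion ("Yes" / "No") is then parsed into a binary match label; the same pattern extends to error detection and data imputation by changing the question.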


