Teaching Perception

11/21/2019
by Jonathan Connell, et al.

The visual world is very rich and generally too complex to perceive in its entirety, yet only certain features are typically required to perform a given task adequately. Rather than hardwiring decisions about when and what to sense, this paper describes a robotic system whose behavioral policy can be set by the verbal instructions it receives. These capabilities are demonstrated in an associated video showing the fully implemented system guiding the perception of a physical robot in a simple scenario. The structure and functioning of the underlying natural-language-based symbolic reasoning system is also discussed.
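As a rough illustration of the idea described above (not the paper's actual implementation), verbal instructions of the form "if you see X, then Y" could be compiled into condition-action rules that determine which features the robot needs to perceive. All class and method names here are hypothetical:

```python
# Hypothetical sketch: verbal instructions become perception-gating rules.
import re

class PerceptionPolicy:
    """Stores instruction-derived rules: trigger feature -> action."""

    def __init__(self):
        self.rules = []  # list of (condition, action) pairs

    def teach(self, instruction):
        # Handle instructions of the form "if you see X, (then) Y".
        m = re.match(r"if you see (?:a |an )?(.+?), (?:then )?(.+)",
                     instruction.lower())
        if m:
            self.rules.append((m.group(1), m.group(2)))
            return True
        return False

    def decide(self, percepts):
        # Only features named in some rule matter; return the actions
        # whose trigger conditions appear among the current percepts.
        return [action for cond, action in self.rules if cond in percepts]

policy = PerceptionPolicy()
policy.teach("If you see a red block, point at it")
print(policy.decide({"red block", "blue cup"}))  # -> ['point at it']
```

The point of the sketch is that the set of taught rules implicitly defines what is worth sensing: any feature not mentioned in a rule can be ignored, so perception is scoped by instruction rather than hardwired.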

