Theses and Dissertations

Date of Award

5-1-2025

Document Type

Thesis

Degree Name

Master of Science (MS)

Department

Computer Science

First Advisor

Dongchul Kim

Second Advisor

Bin Fu

Third Advisor

Haoteng Tang

Abstract

This work explores the application of Transformer models to robotic skill learning, aiming to improve generalization across diverse physical tasks and continuous-control environments. Despite their success in other domains, our experiments reveal that the utility of Transformers in robotics depends heavily on the pretraining strategy. Specifically, Transformers pretrained on reinforcement learning tasks generalized effectively, while those pretrained with task-agnostic masking strategies did not. These findings challenge assumptions about the universality of Transformer-based methods and underscore the importance of domain-aligned pretraining for developing versatile robotic agents.

Comments

Copyright 2025 Erik Enriquez. https://proquest.com/docview/3240612198
