
FlashAttention: Fast Transformer training with long sequences

January 17, 2023

For the past two months, I've been collaborating with Adept as a part-time research fellow, and we've been developing improvements to make FlashAttention even better! In this post, we describe one key improvement we're particularly excited about: making FlashAttention fast for long sequences, enabling the training of large language models with longer context.
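
The full post covers the algorithmic details; purely as an illustration (not code from the post), the sketch below shows how a FlashAttention-style fused attention kernel can be invoked today through PyTorch 2's scaled_dot_product_attention, which dispatches to a FlashAttention backend on supported CUDA GPUs. The shapes and dtypes are arbitrary assumptions chosen for the example.

```python
# Illustrative sketch only: fused, FlashAttention-style attention via PyTorch 2.
# Requires PyTorch >= 2.0 and a CUDA GPU for the fused backend.
import torch
import torch.nn.functional as F

batch, heads, seq_len, head_dim = 2, 8, 4096, 64  # long sequence (assumed sizes)
q = torch.randn(batch, heads, seq_len, head_dim, device="cuda", dtype=torch.float16)
k = torch.randn_like(q)
v = torch.randn_like(q)

# A fused kernel computes attention without materializing the full
# seq_len x seq_len score matrix, which is what makes long sequences practical.
out = F.scaled_dot_product_attention(q, k, v, is_causal=True)
```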


ACT-1: Transformer for Actions

September 14, 2022

At Adept, we are building the next frontier of models that can take actions in the digital world—that’s why we’re excited to introduce our first large model, Action Transformer (ACT-1).


Introducing Adept

David Luan
April 26, 2022

Adept is an AI research and product lab building general intelligence by enabling humans and computers to work together creatively. 
