Discovery · Synthesis · Creation

The Research Workspace

Research is curiosity. Curiosity is human. So research should feel human too—a natural extension of how you think and work.

Meet Ada

The AI-native research workspace for deep thinkers.

Research is a living system.

Traditional tools fragment your thinking—papers in one place, annotations in another, writing somewhere else, and your ideas scattered everywhere.

Ada keeps everything connected in one living workspace, so context carries forward and your thinking compounds over time.

One workspace. One evolving context. One connected thread of thought.

[Product mockup: a tabbed workspace (Library · Reader · Notes · Browser) open to "Attention Is All You Need" (Vaswani, Shazeer, Parmar, et al., NeurIPS 2017). The Reader shows the abstract; the Notes panel shows a highlighted sentence from page 1, a note ("Attention replaces recurrence entirely, enabling parallelization") linked to [[Transformers]], a question comparing the approach to BERT, and an Ada answer summarizing the key innovation: the Transformer replaces recurrence with self-attention, enabling O(1) sequential operations and full parallelization.]

Built for students, scholars, builders, and independent thinkers.

Unifying the creative loop.

From first search to final draft.

A focused space built for academic exploration

Search, save, highlight, cite. Everything stays connected as you move through the literature—not scattered across tabs.

  • Save papers to your library instantly
  • Highlight and annotate in-context
  • One-click citations in any format

No more hopping between applications

Move seamlessly from browsing to reading to writing. Your library holds what you've collected. Your notebooks turn sources into synthesis. Writing isn't a restart—it's the next step.

  • Structured library with read status
  • Notebooks pull in your highlights
  • Write with sources at your fingertips

AI that knows what you're thinking about

Get answers grounded in your papers, highlights, and notes—not generic responses. Ada understands your research context and responds accordingly.

  • AI knows your research context
  • Clarify terms, track arguments
  • Responses cite your sources (see the sketch below)
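For the technically curious, the pattern this feature describes is often called retrieval-grounded answering: retrieve from the user's own material first, then answer with citations. The sketch below is illustrative only; the `Highlight` type, the term-overlap scoring, and the sample library are assumptions made for the example, not Ada's actual API or implementation.

```python
# Minimal sketch of retrieval-grounded answering: answers come from the
# user's own highlights, with citations, rather than generic model output.
# Everything here is illustrative -- it is not Ada's actual implementation.
from dataclasses import dataclass

@dataclass
class Highlight:
    paper: str   # source paper the highlight came from
    text: str    # the highlighted passage

def score(query: str, passage: str) -> int:
    """Crude relevance: count query terms that appear in the passage."""
    terms = set(query.lower().split())
    return sum(term in passage.lower() for term in terms)

def grounded_answer(query: str, library: list[Highlight], k: int = 2) -> str:
    """Answer from the user's own highlights, citing each source."""
    ranked = sorted(library, key=lambda h: score(query, h.text), reverse=True)
    cited = [f'"{h.text}" ({h.paper})' for h in ranked[:k] if score(query, h.text) > 0]
    if not cited:
        return "No grounding found in your library."
    return "Grounded in your highlights: " + " | ".join(cited)

library = [
    Highlight("Vaswani et al. 2017", "attention replaces recurrence and enables parallelization"),
    Highlight("Devlin et al. 2018", "bidirectional pre-training conditions on both left and right context"),
]
print(grounded_answer("how does attention enable parallelization?", library))
```

Real systems typically swap the term-overlap scoring for embedding similarity, but the shape of the loop stays the same.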

See connections and insights you would have missed

Lay out papers visually, connect themes across your library, and watch insights emerge. Some discoveries only happen when you can see the whole picture.

  • Visual canvas for papers & ideas
  • Draw connections between sources
  • AI surfaces hidden relationships (see the sketch below)
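Again purely as illustration: one simple way relationships between papers can be surfaced is by measuring how much their key terms overlap. The term sets and the Jaccard threshold below are hypothetical examples, not a description of Ada's method.

```python
# Illustrative only: surface candidate connections between papers by
# Jaccard overlap of their key-term sets.
papers = {
    "Attention Is All You Need": {"attention", "transformer", "sequence", "nlp"},
    "BERT": {"attention", "transformer", "pretraining", "nlp"},
    "Vision Transformer": {"attention", "transformer", "image", "patches"},
}

def jaccard(a: set[str], b: set[str]) -> float:
    """Overlap of two term sets, from 0.0 (disjoint) to 1.0 (identical)."""
    return len(a & b) / len(a | b)

# Emit an edge for every pair of papers whose overlap clears a threshold.
titles = list(papers)
for i, t1 in enumerate(titles):
    for t2 in titles[i + 1:]:
        sim = jaccard(papers[t1], papers[t2])
        if sim >= 0.3:
            print(f"{t1} <-> {t2} (similarity {sim:.2f})")
```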

Every paper and note lives inside a project workspace

Your research stays grouped, searchable, and easy to revisit. Ada turns scattered reading into one coherent research system.

  • Project-based organization
  • Papers, notes, maps together
  • Full-text search across everything
[Product mockup: Ada's in-browser save flow on arxiv.org/abs/1706.03762, adding "Attention Is All You Need" to the "Thesis Research" project along with a highlight ("scaled dot-product attention") and a note ("enables parallelization").]
[Product mockup: Library view of a "My Research" project with filters (Unread, Favorites, Has Notes) and a sortable list of papers spanning NLP, public health, and nephrology.]
[Product mockup: Notes view drafting "Literature Review: Attention Mechanisms", with [[ ]]-style note linking and Ada expanding on the impact of self-attention using the user's notes from "Attention Is All You Need".]

[Product mockup: Reader sidebar where Ada, drawing on the user's highlight, explains that scaled dot-product attention takes the dot product of queries and keys, scaled by √d_k.]
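For the curious reader, the mechanism behind that exchange is scaled dot-product attention, defined in Vaswani et al. (2017) as

\[
\mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V
\]

where Q, K, and V are the query, key, and value matrices and d_k is the key dimension. Scaling by √d_k keeps large dot products from pushing the softmax into regions with extremely small gradients, which is what "preventing vanishing gradients" refers to.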
[Product mockup: Canvas view grouping "Attention Is All You Need" (Vaswani, 2017), BERT (Devlin, 2018), GPT-3 (Brown, 2020), and Vision Transformer (Dosovitskiy, 2020) under a "Self-Attention" theme, with the surfaced insight that attention mechanisms unify NLP and vision.]
[Product mockup: Projects view open to "Thesis Ch. 3" (12 papers, 34 notes), showing papers with highlight counts, a recently updated literature-review draft, method notes drawing on 12 sources, an unread paper, and a research map with 6 connections.]

For the curious.

Ada is for the way research actually happens — nonlinear, iterative, and driven by curiosity.

Whether you're writing a thesis, synthesizing literature, building a new argument, or exploring an unfamiliar field, Ada keeps everything connected so your work stays coherent over time.

Built for thesis writers · 200+ sources, long timelines

Research, without limits.

Ada is being built alongside researchers, academics, and thinkers at institutions like Harvard, Stanford, and MIT, among many others, to redefine how we interact with information.

Free during beta · macOS first · Windows coming soon