---
title: "Repository Overview"
description: "Walk the call path across README, manifest, events endpoint, app bootstrap, listeners, AI orchestration, and streaming. Observe the verbose logs and streaming behavior."
canonical_url: "https://vercel.com/academy/slack-agents/repository-flyover"
md_url: "https://vercel.com/academy/slack-agents/repository-flyover.md"
docset_id: "vercel-academy"
doc_version: "1.0"
last_updated: "2026-04-09T09:43:35.656Z"
content_type: "lesson"
course: "slack-agents"
course_title: "Slack Agents on Vercel with the AI SDK"
prerequisites: []
---

<agent-instructions>
Vercel Academy — structured learning, not reference docs.
Lessons are sequenced.
Adapt commands to the human's actual environment (OS, package manager, shell, editor) — detect from project context or ask, don't assume.
The lesson shows one path; if the human's project diverges, adapt concepts to their setup.
Preserve the learning goal over literal steps.
Quizzes are pedagogical — engage, don't spoil.
Quiz answers are included for your reference.
</agent-instructions>

# Repository Overview

## How a Slack mention becomes an AI reply

When you mention your bot, the event routes through Nitro → Bolt → your listeners → AI orchestration. If you don't know where to add logs or fix bugs, you waste time. This lesson maps the complete path so you can debug fast and extend with confidence.

## Outcome

Understand where everything lives and trace a bot mention from HTTP entrypoint → AI reply.

## Fast Track

1. Trace the path: `events.post.ts` → `app.ts` → `app-mention.ts` → `createTextStream` → streaming response
2. Run `slack run` and mention the bot in Slack
3. Watch the streaming response in Slack UI and observe the verbose DEBUG logs in terminal

## Read the Code

Key files for tracing the event flow:

- `README.md`
- `manifest.json`
- `server/api/slack/events.post.ts`
- `server/app.ts`
- `server/listeners/*`
- `server/lib/ai/*`
- `scripts/*`
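The glue in `events.post.ts` is small. Here is a runnable, illustrative stand-in for the handoff it performs (a fake receiver, not Bolt's actual `VercelReceiver` API — the real route converts Nitro's event with `toWebRequest()` and hands it to the receiver configured in `server/app.ts`):

```typescript
// Illustrative stand-in for the Nitro → Bolt handoff in
// server/api/slack/events.post.ts. Names here are sketches,
// not the repo's actual exports.

type Receiver = { handle(req: Request): Promise<Response> };

// Fake receiver so the shape is runnable; Bolt's real receiver
// also verifies Slack's request signature before routing.
const receiver: Receiver = {
  async handle(req) {
    const body = await req.json();
    // Slack's one-time URL verification handshake: echo the challenge.
    if (body.type === "url_verification") {
      return new Response(body.challenge);
    }
    // Acknowledge promptly; Slack retries unacknowledged events.
    return new Response("ok");
  },
};

export async function eventsPost(req: Request): Promise<Response> {
  return receiver.handle(req);
}
```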

### How a Slack message travels

```
┌─────────────────────────────────────────────────────────────────┐
│                    Slack Event Flow                            │
└─────────────────────────────────────────────────────────────────┘

User sends message in Slack
        ↓
┌─────────────────┐    HTTP POST    ┌─────────────────┐
│     Slack       │ ──────────────→ │  events.post.ts │
│   Platform      │                 │  (Nitro route)  │
└─────────────────┘                 └─────────────────┘
                                            ↓ toWebRequest()
                                    ┌─────────────────┐
                                    │ VercelReceiver  │
                                    │   (Bolt)        │
                                    └─────────────────┘
                                            ↓ route event
                                    ┌─────────────────┐
                                    │ Event Listeners │
                                    │  (app-mention,  │
                                    │ direct-message) │
                                    └─────────────────┘
                                            ↓ fetch context
                                    ┌─────────────────┐
                                    │ createTextStream│
                                    │ (AI + tools)    │
                                    └─────────────────┘
                                            ↓ streamText()
                                    ┌─────────────────┐
                                    │client.chatStream│
│(Slack streaming)│
                                    └─────────────────┘
                                            ↓ for await...append
                                    ┌─────────────────┐
                                    │ Streamed Reply  │
                                    │ + feedback block│
                                    └─────────────────┘
```
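The middle of the diagram — Bolt routing events to listeners — boils down to a registry lookup. A simplified, self-contained sketch of the pattern (illustrative only; the repo registers real handlers via `app.event(...)` in `server/app.ts`):

```typescript
// Simplified version of what app.ts + Bolt do: register one
// handler per event type, then route each verified payload to
// the matching handler.

type Say = (text: string) => Promise<void>;
type Handler = (args: { text: string; say: Say }) => Promise<void>;

const listeners = new Map<string, Handler>();

// Registration, like app.event("app_mention", handler) in app.ts
export function on(eventType: string, handler: Handler) {
  listeners.set(eventType, handler);
}

// Routing, like Bolt dispatching after signature verification
export async function route(eventType: string, text: string, say: Say) {
  const handler = listeners.get(eventType);
  if (handler) await handler({ text, say });
}
```

The real listeners in `server/listeners/*` do more at step 3 — fetch thread context, call `createTextStream`, pipe chunks to Slack — but the registration/dispatch shape is the same.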

## Try It

1. **Watch the streaming flow in action:**

   ```bash
   slack run
   ```

   Keep the process alive. Mention the bot in Slack: `@bot what's up?`

   **In Slack:** Watch the response appear word-by-word as chunks stream in. This is `client.chatStream()` delivering real-time updates.

   **In your terminal:** You'll see verbose DEBUG logs like this:

   ```
   [DEBUG]  bolt-app app_mention event received: {"type":"app_mention","user":"U09TJB25XQT",...}
   [DEBUG]  web-api:WebClient:0 apiCall('assistant.threads.setStatus') start
   [DEBUG]  web-api:WebClient:0 apiCall('conversations.replies') start
   [DEBUG]  bolt-app Active tools: Set(0) {}
   [DEBUG]  web-api:WebClient:1 ChatStreamer appended to buffer: {"bufferLength":2,...}
   [DEBUG]  web-api:WebClient:1 ChatStreamer appended to buffer: {"bufferLength":8,...}
   [DEBUG]  web-api:WebClient:1 apiCall('chat.startStream') start
   [DEBUG]  web-api:WebClient:1 apiCall('chat.stopStream') start
   ```

2. **Observe the flow (even if it's noisy):**
   - The first log shows the `app_mention` event with its full payload
   - `assistant.threads.setStatus` is the "is typing..." indicator
   - `conversations.replies` fetches thread context
   - `ChatStreamer appended to buffer` lines show text chunks accumulating
   - `chat.startStream` sends the first chunk to Slack
   - `chat.stopStream` completes the message with feedback blocks

**Side Quest: Add Correlation Logging**
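One way to approach the side quest (a hypothetical helper — the starter repo ships no such utility): generate a short ID per incoming event and prefix every log line with it, so you can grep one request's path through the noisy DEBUG output.

```typescript
import { randomUUID } from "node:crypto";

// Hypothetical correlation logger: one ID per Slack event,
// shared across every stage that event touches.
export function createLogger(correlationId: string = randomUUID().slice(0, 8)) {
  return {
    correlationId,
    log(stage: string, detail?: unknown) {
      console.log(`[${correlationId}] ${stage}`, detail ?? "");
    },
  };
}
```

Create one logger at the top of the listener and pass it down into the AI orchestration; every line prefixed with the same `[a1b2c3d4]` then belongs to the same mention.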

## Commit

```bash
git add -A
git commit -m "docs(architecture): understand event flow and streaming architecture

- Trace Slack events from HTTP entry to streaming AI response
- Observe verbose DEBUG logs and identify pain points
- Map file responsibilities across the codebase"
```

## Done-When

- [ ] Can trace an event from `events.post.ts` → `app.ts` → listener → `createTextStream` → streaming response
- [ ] Observed streaming behavior in both Slack UI and terminal logs
- [ ] Understand the role of each major directory and file
- [ ] Recognize that logging needs improvement (sets up the Side Quest motivation)


---

[Full course index](/academy/llms.txt) · [Sitemap](/academy/sitemap.md)
