PlainText.ink

Sovereign reading & writing — e‑Ink‑first.

Local‑first. Offline‑ready. Paperwork‑minimal. Works great at 480×800 portrait and 800×480 landscape resolutions.

Sample mode finishes quickly, uses minimal tokens, and still produces a valid EPUB.

Features

Textbook Builder

Generate EPUBs by grade, language, and topic. Pick Sample (fast) or Full (multi‑chapter).

Form Helper

Fill PDFs from simple YAML schemas (see the sketch below). Auto‑fill with your local model; review and export.

Letter Explainer

Summarize letters and generate checklists. Offline‑ready, privacy‑first.
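
As a rough idea of the Form Helper flow, here is a minimal sketch that maps a flat YAML schema onto PDF form fields. The schema layout, the file names, and the PyYAML/pypdf libraries are illustrative assumptions, not the project's actual schema or implementation.

import yaml                                  # pip install pyyaml
from pypdf import PdfReader, PdfWriter       # pip install pypdf

# Assumed schema: a flat mapping of PDF field names to values, e.g.
#   full_name: Jane Doe
#   date_of_birth: 2001-04-17
with open("form_schema.yaml") as f:
    fields = yaml.safe_load(f)

reader = PdfReader("blank_form.pdf")         # a fillable (AcroForm) PDF
writer = PdfWriter()
writer.append(reader)                        # copy pages along with the form definition

for page in writer.pages:
    writer.update_page_form_field_values(page, fields)

with open("filled_form.pdf", "wb") as out:   # review the result, then export or print
    writer.write(out)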

How it works

Your browser opens a local app at http://127.0.0.1:8000. That app talks to your on‑device LLM runtime (e.g., Ollama) on localhost. No cloud calls.
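
To make this concrete, below is a minimal sketch of such a localhost call, assuming Ollama's default /api/generate endpoint on port 11434 and the llama3:8b model from the install steps below; the app's actual client code may differ.

import json
import urllib.request

payload = {
    "model": "llama3:8b",                      # any model you have pulled locally
    "prompt": "Explain fractions to a 5th grader in one sentence.",
    "stream": False,                           # ask for a single JSON response
}
req = urllib.request.Request(
    "http://127.0.0.1:11434/api/generate",     # Ollama's default local endpoint
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"]) # generated text; nothing leaves the machine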

Install & run locally (macOS)

Ollama example (replace the model as desired):

brew install ollama
ollama serve                    # start the local runtime and keep it running
ollama run llama3:8b            # in another terminal: pulls the model on first run

Clone & run BookDook (this repo):

git clone https://github.com/yosun/bookdook.git
cd bookdook/BookDook
python3 -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt
export PLAIN_BACKEND=ollama                  # point the app at your local Ollama runtime
./.venv/bin/python cli/serve_frontend.py     # serves the app at http://127.0.0.1:8000

Open the app: http://127.0.0.1:8000/layout/page.html

CLI quickstart

./.venv/bin/python cli/build_textbook.py --grade 5 --lang en --topic "Fractions" --mode sample
./.venv/bin/python cli/build_textbook.py --grade 5 --lang en --topic "Fractions" --mode full
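
For context on the "valid EPUB" claim, the sketch below hand-rolls the bare container structure every EPUB needs: an uncompressed mimetype entry first, a META-INF/container.xml pointing at the package document, and at least one XHTML chapter. It is illustrative only, not the builder's actual output, and a fully conformant EPUB 3 would also include a navigation document.

import zipfile

CONTAINER = """<?xml version="1.0"?>
<container version="1.0" xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
  <rootfiles>
    <rootfile full-path="content.opf" media-type="application/oebps-package+xml"/>
  </rootfiles>
</container>"""

OPF = """<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://www.idpf.org/2007/opf" version="3.0" unique-identifier="pub-id">
  <metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
    <dc:identifier id="pub-id">sample-fractions-grade5</dc:identifier>
    <dc:title>Fractions (Sample)</dc:title>
    <dc:language>en</dc:language>
    <meta property="dcterms:modified">2024-01-01T00:00:00Z</meta>
  </metadata>
  <manifest>
    <item id="ch1" href="chapter1.xhtml" media-type="application/xhtml+xml"/>
  </manifest>
  <spine>
    <itemref idref="ch1"/>
  </spine>
</package>"""

CHAPTER = """<?xml version="1.0" encoding="utf-8"?>
<html xmlns="http://www.w3.org/1999/xhtml">
  <head><title>Chapter 1</title></head>
  <body><h1>Fractions</h1><p>A fraction names equal parts of a whole.</p></body>
</html>"""

with zipfile.ZipFile("sample.epub", "w", zipfile.ZIP_DEFLATED) as z:
    # The mimetype entry must come first and must be stored uncompressed.
    z.writestr("mimetype", "application/epub+zip", compress_type=zipfile.ZIP_STORED)
    z.writestr("META-INF/container.xml", CONTAINER)
    z.writestr("content.opf", OPF)
    z.writestr("chapter1.xhtml", CHAPTER)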

FAQ

Can the hosted site use my local LLM directly? Not reliably: browser security rules make calls from a hosted page to http://localhost flaky. Open the local app above instead; it talks to your model entirely on localhost.