About LanJAM

LanJAM is a private, self-hosted AI chat application built for families who want to explore artificial intelligence together — safely and without giving up their privacy.

What is LanJAM?

Think of LanJAM as your family's own private ChatGPT. It runs entirely on your home network — no cloud accounts, no subscriptions, no data leaving your house. Every family member gets their own account, and all conversations stay completely private.

LanJAM uses open-source AI models (via Ollama) that run directly on a computer in your home. You chat through your web browser — there's nothing to install on phones, tablets, or laptops. Just open the address and start talking.

Why we built it

AI is an incredible tool for learning, creativity, and productivity. But most AI services require sending your conversations to the cloud, where they may be stored, analysed, or used for training. For families — especially those with children — this raises real concerns about privacy and content safety.

LanJAM was created to solve this. It gives families a way to use AI that is:

Completely private: Nothing leaves your network.

Safe by design: Age-appropriate rules for children and teens.

Offline capable: Works without internet once set up.

Multi-user: Each person gets their own isolated space.

How it works

1. Install LanJAM on a home computer

Download the installer and run it on any machine — a desktop, laptop, or even a Raspberry Pi. LanJAM runs as a web server accessible to all devices on your network.

2. Download an AI model

LanJAM uses Ollama to run AI models locally. Pick a model from the admin panel and it downloads straight to your machine — no cloud API key needed.
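Once a model is downloaded, chatting works over Ollama's local HTTP API. As a rough sketch of what that looks like under the hood (function and type names here are illustrative, not LanJAM's actual code; the endpoint and default port are Ollama's documented ones):

```typescript
// Shape of one message in an Ollama chat conversation.
type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

// Build the JSON body for a POST to Ollama's /api/chat endpoint.
function buildChatRequest(model: string, messages: ChatMessage[]) {
  return {
    model,         // e.g. "llama3.2" — whichever model you pulled
    messages,      // the full conversation history so far
    stream: false, // set true to stream tokens as they are generated
  };
}

// Send the conversation to an Ollama server on your LAN and return the reply.
async function chat(host: string, model: string, messages: ChatMessage[]): Promise<string> {
  const res = await fetch(`${host}/api/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(buildChatRequest(model, messages)),
  });
  const data = await res.json();
  return data.message?.content ?? "";
}

// Example call against Ollama's default local address:
// chat("http://localhost:11434", "llama3.2", [{ role: "user", content: "Hello!" }]);
```

Because the host is a machine on your own network rather than a cloud endpoint, the conversation never travels beyond your router.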

3. Add family members and start chatting

Create accounts for everyone, assign age roles for automatic safety rules, and you're good to go. Everyone opens their browser, picks their name, and starts chatting.

Technical overview

LanJAM is a full-stack web application built with React Router, PostgreSQL, and MinIO for file storage. It communicates with Ollama for AI model inference and optionally uses Whisper for voice-to-text. The entire stack runs locally via Docker Compose.
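A Docker Compose file for a stack like this might look roughly as follows. This is a minimal sketch only — the service names, images, and ports are assumptions for illustration, not LanJAM's actual compose file:

```yaml
# Illustrative sketch — not LanJAM's real configuration.
services:
  app:                        # the React Router web application
    image: lanjam/app:latest  # hypothetical image name
    ports:
      - "3000:3000"           # the address family members open in a browser
    depends_on: [db, minio, ollama]
  db:
    image: postgres:16        # stores accounts and conversations
    environment:
      POSTGRES_PASSWORD: change-me
  minio:
    image: minio/minio        # S3-compatible file storage for uploads
    command: server /data
  ollama:
    image: ollama/ollama      # local AI model inference
```

Running everything as containers on one host is what makes the single-installer, zero-cloud setup possible: each service binds only to the home network.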

Open source

LanJAM is free and open source. You can inspect every line of code, contribute, or fork it for your own needs.