Banana Code Blog

Release notes and deeper dives. More posts will show up here over time.

All posts

April 23, 2026 · Release

BananaCode v2.4.0: DeepReview & Enhanced Personalization

2.4.0 Release Preview

BananaCode v2.4.0 is here, focusing on giving users more control over how the AI interacts and introducing a powerful new audit mode.

🔍 DeepReview: Full Codebase Audit

The new /deepreview command switches BananaCode into a specialized review mode. You can choose between a Full Review (auditing the entire current codebase) or a Diff Review (reviewing only staged/unstaged changes via git diff). In this mode, BananaCode focuses purely on providing a structured report with Critical, Warning, and Suggestion findings, without making any file modifications.
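The post does not show the report format, but the three severity tiers suggest a simple grouped layout. Here is a minimal sketch of what grouping findings by severity could look like; the `Finding` class, field names, and formatting are our assumptions, not BananaCode's actual internals:

```python
# Illustrative sketch only: grouping DeepReview-style findings by
# severity, most serious first. Not BananaCode's real implementation.
from dataclasses import dataclass

SEVERITIES = ("Critical", "Warning", "Suggestion")

@dataclass
class Finding:
    severity: str  # one of SEVERITIES
    file: str
    message: str

def format_report(findings):
    """Render findings grouped by severity tier."""
    lines = []
    for severity in SEVERITIES:
        matches = [f for f in findings if f.severity == severity]
        if not matches:
            continue
        lines.append(f"{severity} ({len(matches)})")
        for f in matches:
            lines.append(f"  {f.file}: {f.message}")
    return "\n".join(lines)

report = format_report([
    Finding("Suggestion", "utils.py", "prefer pathlib over os.path"),
    Finding("Critical", "auth.py", "API key logged in plaintext"),
])
print(report)
```

Because the mode never modifies files, a structure like this is all the output there is: a ranked list you can act on yourself.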

✨ Emoji & Style Personalization

We've added more ways to customize your AI pair programmer's personality:

🛠️ New Tools & UI Polish

Bug Fixes & Reliability

We've improved tool execution error handling to better manage user cancellations and repair dangling tool calls. Additionally, startup telemetry now correctly uses HTTPS for more secure connections.

Update now with npm install -g @banaxi/banana-code and try out the new /deepreview command!

April 2026 · New Feature

Local Intelligence: LM Studio Support is Here!

LM Studio Support Preview

Banana Code has always been about flexibility, and today we're taking a huge leap towards local-first development. We are excited to announce full, first-class support for LM Studio.

Why LM Studio?

LM Studio has become the go-to tool for running large language models (LLMs) locally on your own hardware. By integrating LM Studio, Banana Code users can now leverage powerful models like Llama 3, Mistral, and many others without needing an API key or an active internet connection for the model inference.

First-Class Features

This isn't just a simple proxy; we've implemented a full provider suite tailored for the local experience:

Getting Started

Switching to LM Studio is simple. Just run the following command in your terminal:

/provider lmstudio

Banana Code will ask for your local server URL (defaulting to http://localhost:1234/v1) and then let you pick from your loaded models. You can also configure it during initial setup with banana --setup.
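LM Studio serves an OpenAI-compatible API under that base URL, so client requests target paths like `/models` and `/chat/completions`. A tiny sketch of the URL construction (the helper is ours for illustration; only the default base URL comes from the post):

```python
# Build endpoint URLs against a local LM Studio server's
# OpenAI-compatible API. Helper name is illustrative.
DEFAULT_BASE = "http://localhost:1234/v1"

def endpoint(path, base=DEFAULT_BASE):
    """Join an API path onto the server base URL."""
    return f"{base.rstrip('/')}/{path.lstrip('/')}"

print(endpoint("models"))            # lists the models you have loaded
print(endpoint("chat/completions"))  # chat endpoint used for inference
```

A quick `curl` against the `/models` URL is a handy way to confirm your LM Studio server is up before pointing Banana Code at it.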

Optimized for Performance

We've included automatic JSON schema sanitization for local models, ensuring that even strict local inference engines can understand and use Banana Code's tool definitions without errors.

Install Banana Code with npm install -g @banaxi/banana-code, download LM Studio at lmstudio.ai, and start coding locally today!

April 2026 · Release

2.0.0 Released: What Changed?

2.0.0 Release Preview

Version 2.0.0 is a major step forward for Banana Code as a terminal-native AI pair programmer. Here is a concise tour of what shipped, aligned with the actual app behavior.

Smarter Auto Mode (model + effort)

When you pick Auto Mode as your model, a small router model still picks the best concrete model for each user turn—but for Claude, it now also selects a reasoning effort level (low through max, including xhigh where supported). That keeps simple questions cheap and fast while reserving depth for hard tasks. Use /effort to adjust effort manually when you are on Claude.
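As a toy illustration of the model-plus-effort idea: a router maps each turn to a concrete model and, on Claude, an effort level. The heuristic and model names below are placeholders we made up; the real router is itself a model, per the post:

```python
# Toy router: picks (model, effort) per turn. Heuristic, model names,
# and keyword list are illustrative assumptions, not the real router.
EFFORTS = ["low", "medium", "high", "xhigh", "max"]

def route(prompt, provider="claude"):
    hard = len(prompt) > 200 or any(
        w in prompt.lower() for w in ("refactor", "debug", "design")
    )
    effort = "high" if hard else "low"
    assert effort in EFFORTS  # effort ladder from the post
    model = "strong-model" if hard else "fast-model"  # placeholder names
    # Effort selection only applies on Claude.
    return (model, effort) if provider == "claude" else (model, None)

print(route("What does this flag do?"))  # simple turn stays cheap
print(route("Debug this race condition in the scheduler"))
```

The economics follow directly: short factual questions never pay for deep reasoning, while genuinely hard turns get it automatically.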

Interactive terminal suite

Banana Code moves beyond one-shot shell runs. New tools drive a persistent PTY:

Together, these let the agent work through flows that used to stall on non-interactive runners—while one-off tasks still use execute_command.

Financial intelligence

For providers that expose usage (notably Anthropic), the app tracks real session spend and estimates what you saved with Prompt Caching. Run /context for a breakdown (messages, estimated tokens by category, cost, cache savings). On exit, you get a final session cost summary when costing is available.
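The cache-savings math is simple: cached input tokens bill at a discounted rate, and the "saved" figure is what those tokens would have cost at the full input rate. A back-of-envelope sketch, with made-up placeholder prices (not real Anthropic rates):

```python
# Back-of-envelope session cost and cache savings.
# Prices are invented placeholders in $ per 1M tokens.
PRICE_IN, PRICE_CACHED, PRICE_OUT = 3.00, 0.30, 15.00

def session_cost(uncached_in, cached_in, out):
    """Return (actual cost, savings vs. no caching), both in dollars."""
    cost = (uncached_in * PRICE_IN + cached_in * PRICE_CACHED + out * PRICE_OUT) / 1e6
    saved = cached_in * (PRICE_IN - PRICE_CACHED) / 1e6
    return round(cost, 4), round(saved, 4)

cost, saved = session_cost(uncached_in=50_000, cached_in=400_000, out=20_000)
print(cost, saved)
```

With numbers like these, a session that reads most of its context from cache can save more than it spends, which is exactly what the `/context` breakdown surfaces.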

Skill Creator mode

New command /skill-creator switches the assistant into a mode that helps you author Agent Skills: structured SKILL.md files with YAML frontmatter, written under ~/.config/banana-code/skills/<skill-name>/. The status bar shows SKILL CREATOR MODE; return with /agent.
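For a sense of what gets written, a minimal SKILL.md skeleton might look like the following; the frontmatter fields and body here are illustrative, and Skill Creator walks you through producing the real structure:

```markdown
---
name: release-notes
description: Draft release notes from merged changes in the house style.
---

# Release Notes Skill

1. Collect the changes merged since the last tag.
2. Group them into features, fixes, and docs.
3. Write each entry in the project's changelog voice.
```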

New slash commands and style

Built-in docs for the model

The get_banana_docs tool gives the model a reliable summary of Banana Code (plus README when present), so answers about slash commands and setup stay accurate.

UltraMemory (optional)

Enable UltraMemory under /settings to run background summarization of eligible chats into global memory. It can significantly increase API usage; the CLI asks for confirmation before turning it on, and only processes activity after you enable it.

Richer @-mentions

File mentions support quoted paths (spaces), ~ expansion, and attaching images via @@path for multimodal providers.
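A rough sketch of how those three behaviors combine when parsing a mention token; the function is ours for illustration and mirrors the described behavior, but it is not BananaCode's actual parser:

```python
# Illustrative @-mention parser: quoted paths, ~ expansion,
# and @@ for image attachments. Not BananaCode's real code.
import os

def parse_mention(token):
    """Return (resolved_path, is_image) for an @-mention token."""
    is_image = token.startswith("@@")        # @@ attaches an image
    path = token[2:] if is_image else token[1:]
    path = path.strip('"')                   # allow @"path with spaces"
    return os.path.expanduser(path), is_image

print(parse_mention('@"my notes.md"'))
print(parse_mention("@@~/shot.png"))
```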

Headless API security

banana --api now uses a generated API token stored at ~/.config/banana-code/token.json. HTTP requests need an Authorization: Bearer <token> header or a ?token= query parameter; WebSockets should connect with ?token=... unless you explicitly pass --no-auth (discouraged).
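Server-side, that means accepting the same token from either place. A minimal sketch of the check (the function is ours; only the header and query-parameter shapes come from the docs above):

```python
# Illustrative auth check: accept the token via Authorization header
# or ?token= query parameter. Not BananaCode's actual server code.
from urllib.parse import urlparse, parse_qs

def is_authorized(headers, url, token):
    """True if the request carries the expected API token."""
    if headers.get("Authorization", "") == f"Bearer {token}":
        return True
    qs = parse_qs(urlparse(url).query)
    return qs.get("token", [None])[0] == token

print(is_authorized({"Authorization": "Bearer s3cret"}, "/chat", "s3cret"))
print(is_authorized({}, "/ws?token=s3cret", "s3cret"))
print(is_authorized({}, "/chat", "s3cret"))
```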

Other polish

Install or upgrade with npm install -g @banaxi/banana-code and read the full docs on the Docs page.