OpenCode and Oh-My-OpenCode Setup Guide: Connect GitHub Copilot Models and Java jdtls

OpenCode with Oh-My-OpenCode unifies multi-model access and language server capabilities in a terminal-first workflow. Together they address three common problems: fragmented AI coding tools, missing Java IntelliSense, and cumbersome model switching.

The following specification snapshot summarizes the stack:

Core tools: OpenCode, Oh-My-OpenCode
Runtime: Node.js 18+
Primary protocols: CLI, LSP, STDIO
Target models: GitHub Copilot model family
Java support: Eclipse JDT Language Server (jdtls)
Typical systems: Windows, with similar steps on macOS/Linux
Core dependencies: npm, PowerShell, JDK 21+

This setup fits developers who need a unified AI coding entry point.

OpenCode provides terminal-native AI coding capabilities, while Oh-My-OpenCode handles model orchestration, category-based routing, and extended configuration. Together, they can call GitHub Copilot models and connect to LSP services for Java completion, navigation, and refactoring.

The real value of the source material is not conceptual explanation. It is executable configuration. In practice, three pain points appear most often: unstable model authorization, tedious jdtls installation, and error-prone JSON configuration. The sections below reorganize the process into four stages: installation, model setup, LSP integration, and troubleshooting.

Start by preparing the base environment.

The recommended environment is Windows with Node.js 18+, PowerShell, and JDK 21+. If you plan to use GitHub Copilot models, you also need an active subscription and completed authentication.
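These version requirements can be checked up front. The sketch below parses a version string in the `vX.Y.Z` format that `node --version` prints; the helper name and the hard-coded sample value are illustrative, not part of any tool's API:

```shell
# Minimal sketch: extract the major version from a "vX.Y.Z" string,
# as printed by `node --version` (the leading "v" is stripped first).
major_version() {
  echo "$1" | sed 's/^v//' | cut -d. -f1
}

# Example check against the Node.js 18+ requirement.
# In practice, replace the literal with: $(major_version "$(node --version)")
node_major=$(major_version "v18.19.0")
if [ "$node_major" -ge 18 ]; then
  echo "Node.js major version OK: $node_major"
else
  echo "Node.js too old: $node_major"
fi
```

The same helper works for any tool that reports a dotted version string.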

npm i -g opencode-ai
npm i -g oh-my-opencode
npx oh-my-opencode install
opencode --version

These commands install OpenCode and its enhancement layer, then verify that the CLI is available.

After installation, organize the model configuration file first.

The key Oh-My-OpenCode configuration file is typically located at .config/opencode/oh-my-openagent.json under your user directory. If the directory does not exist, create it manually. This file defines agents, categories, and later the LSP registration settings.
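Creating the directory is a one-liner; a minimal sketch using the path stated above:

```shell
# Create the OpenCode config directory under the user home if it is missing.
# mkdir -p is a no-op when the directory already exists.
CONFIG_DIR="$HOME/.config/opencode"
mkdir -p "$CONFIG_DIR"
echo "config directory ready: $CONFIG_DIR"
```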

A reliable approach is to use GitHub Copilot as the primary model provider, route high-capability tasks to gpt-5.4, send fast tasks to gpt-5.4-mini, and use fallback_models as a safety net so the entire workflow does not fail when the primary model becomes unavailable.

{
  "$schema": "https://raw.githubusercontent.com/code-yeongyu/oh-my-openagent/dev/assets/oh-my-opencode.schema.json",
  "agents": {
    "sisyphus": {
      "model": "github-copilot/gpt-5.4",
      "variant": "max",
      "fallback_models": [
        { "model": "github-copilot/gpt-5.4", "variant": "max" },
        { "model": "github-copilot/gpt-5.4" },
        { "model": "github-copilot/gpt-5.4", "variant": "medium" }
      ]
    }
  },
  "categories": {
    "quick": {
      "model": "github-copilot/gpt-5.4-mini",
      "fallback_models": [
        { "model": "github-copilot/gpt-5.4-mini" },
        { "model": "github-copilot/claude-haiku-4.5" }
      ]
    }
  }
}

This configuration defines the routing strategy for the primary agent and the quick-task category.

The key to model configuration is not quantity, but fallback behavior.

variant defines the capability tier, which helps separate workloads by task complexity. fallback_models determines whether the system can automatically switch to a secondary option when authorization fluctuates, the network becomes unstable, or the model is rate-limited.
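The fallback behavior amounts to an ordered retry loop: try the primary model, then each fallback in turn, and stop at the first success. The sketch below illustrates the idea only; call_model is a hypothetical stub standing in for the real provider call, not part of OpenCode's API:

```shell
# Stand-in for a real provider call. For illustration, only the
# mini model "responds"; a real call would hit the Copilot endpoint.
call_model() {
  [ "$1" = "github-copilot/gpt-5.4-mini" ]
}

# Try each configured model in order; print the first one that works.
pick_model() {
  for model in "$@"; do
    if call_model "$model"; then
      echo "$model"
      return 0
    fi
  done
  echo "no model available" >&2
  return 1
}

pick_model "github-copilot/gpt-5.4" "github-copilot/gpt-5.4-mini"
```

In this sketch the primary model "fails", so the loop falls through to the mini model, which mirrors what fallback_models does when the primary is rate-limited or unauthorized.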

If model calls fail, first verify that GitHub Copilot is authenticated and that your network allows access to the required endpoints.

Java LSP integration determines whether this setup works for real engineering tasks.

For Java, the practical choice is jdtls, the most mature option in the official ecosystem. It exposes semantic completion, navigation, diagnostics, and some refactoring features through the LSP standard. On the OpenCode side, you only need to launch it with --stdio.

The source material makes one requirement explicit: the JDK must be globally discoverable through environment variables (on PATH or via JAVA_HOME), and the version must be 21 or later. This is a hard prerequisite for jdtls startup.
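That requirement can be sanity-checked by parsing the `java -version` banner. A minimal sketch, assuming a banner line in the usual form `openjdk version "21.0.2" 2024-01-16`:

```shell
# Extract the major version number from a `java -version` banner line.
java_major() {
  echo "$1" | sed -n 's/.*version "\([0-9][0-9]*\)[."].*/\1/p'
}

# Hard-coded sample for illustration; in practice use:
#   banner=$(java -version 2>&1 | head -n 1)
banner='openjdk version "21.0.2" 2024-01-16'
if [ "$(java_major "$banner")" -ge 21 ]; then
  echo "JDK meets the 21+ requirement"
else
  echo "JDK too old for jdtls"
fi
```

Note that `java -version` writes to stderr, hence the 2>&1 redirect in the real invocation.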

# install-jdtls.ps1
# Downloads jdtls 1.57.0, extracts it into the OpenCode bin directory,
# adds that directory to the user PATH, and runs a minimal startup check.
$installDir = "$env:USERPROFILE\.local\share\opencode\bin"
$tempFile = "$installDir\jdtls.tar.gz"
New-Item -ItemType Directory -Force -Path $installDir | Out-Null
Invoke-WebRequest -Uri "https://download.eclipse.org/jdtls/milestones/1.57.0/jdt-language-server-1.57.0-202602261110.tar.gz" -OutFile $tempFile
Set-Location $installDir
tar -xzf jdtls.tar.gz # Windows 10+ ships a bsdtar-based tar.exe
$jdtlsBinDir = "$installDir\bin"
$currentPath = [Environment]::GetEnvironmentVariable("Path", "User")
if (-not $currentPath.Contains($jdtlsBinDir)) {
  [Environment]::SetEnvironmentVariable("Path", "$currentPath;$jdtlsBinDir", "User") # Add the jdtls directory to the user PATH
}
Remove-Item $tempFile -Force
& "$jdtlsBinDir\jdtls.bat" 2>&1 | Select-Object -First 3 # Verify that the command can start

This script downloads jdtls, extracts it, adds it to PATH, and performs a minimal executable check.

Reopen the terminal after installation to validate PATH.

Many users assume the script failed, but the real issue is that PATH changes have not yet taken effect in the current session. Open a new PowerShell window and run the following command.

jdtls -help

This command confirms that the system can resolve jdtls and return its help output.

Java files activate the language server only after you register jdtls in the configuration file.

Add an lsp field at the top level of oh-my-openagent.json. Three details matter most: the command must be jdtls --stdio, the file extensions must include .java, and you can relax memory limits with NODE_OPTIONS when necessary.

{
  "lsp": {
    "jdtls": {
      "command": ["jdtls", "--stdio"],
      "extensions": [".java"],
      "priority": 10,
      "env": {
        "NODE_OPTIONS": "--max-old-space-size=4096"
      },
      "initialization": {
        "preferences": {
          "includeInlayParameterNameHints": "all"
        }
      }
    }
  }
}

This configuration tells OpenCode to start jdtls in standard input/output mode when a Java file opens and to enable parameter name hints.
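Once the registration is in place, a throwaway Java file is enough to confirm the extension match. The sketch below simply writes one with a heredoc; the filename and class name are arbitrary:

```shell
# Create a minimal .java file; opening it in OpenCode should now
# trigger jdtls, since ".java" matches the registered extensions.
cat > Hello.java <<'EOF'
public class Hello {
    public static void main(String[] args) {
        System.out.println("jdtls smoke test");
    }
}
EOF
echo "wrote Hello.java"
```

If completion and diagnostics appear when this file is opened, the lsp registration is working end to end.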

Frontend projects can extend language servers with the same pattern.

If you also maintain Vue projects, install the official Vue language server first, then add it to the lsp field using the same pattern as jdtls. The configuration structure stays the same. Only command and extensions differ.

npm install -g @vue/language-server

This command installs the Vue language server so you can later extend frontend completion support.
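Following the jdtls pattern, the Vue registration could look like the sketch below. The vue-language-server command name and the .vue extension are assumptions based on the package installed above, not values taken from the source, so verify them against the language server's own documentation:

```json
{
  "lsp": {
    "vue": {
      "command": ["vue-language-server", "--stdio"],
      "extensions": [".vue"],
      "priority": 10
    }
  }
}
```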

Most common issues come from environment variables, JSON syntax, and authorization state.

Messages related to JAVA_TOOL_OPTIONS or incubator modules are usually warnings, not errors. As long as you do not see command resolution failures or startup interruption, they should not affect normal jdtls usage.

Another frequent problem is malformed JSON. After every change to oh-my-openagent.json, validate the file structure and pay special attention to commas, array closures, and object nesting.
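Since Node.js is already a prerequisite for this setup, it can double as a JSON validator. A minimal sketch; the helper name is illustrative, and the config path in the usage comment is the one stated earlier in this guide:

```shell
# Validate a JSON file with Node's built-in parser.
# Prints "valid: <file>" or "invalid: <file>".
validate_json() {
  if node -e "JSON.parse(require('fs').readFileSync(process.argv[1], 'utf8'))" "$1" 2>/dev/null; then
    echo "valid: $1"
  else
    echo "invalid: $1"
  fi
}

# Example usage:
# validate_json "$HOME/.config/opencode/oh-my-openagent.json"
```

Running this after every edit catches the trailing-comma and nesting mistakes mentioned above before OpenCode ever reads the file.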

FAQ

Q1: Why does jdtls print warnings but still count as a successful installation?

A: Many messages are only JDK runtime warnings and do not indicate an LSP startup failure. If jdtls -help returns usage information, the command path is usually working correctly.

Q2: Why is OpenCode installed, but Java files still have no completion?

A: The three most common causes are: jdtls is not installed or not globally discoverable, the lsp configuration in oh-my-openagent.json contains a syntax error, or the opened file extension does not match .java.
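The first of those causes can be ruled out quickly with a PATH check; a minimal sketch:

```shell
# Report whether the current shell can resolve the jdtls command at all.
if command -v jdtls >/dev/null 2>&1; then
  echo "jdtls found at: $(command -v jdtls)"
else
  echo "jdtls not on PATH; reinstall or reopen the terminal"
fi
```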

Q3: Why are GitHub Copilot models occasionally unavailable?

A: The usual causes are authorization state, network connectivity, or model rate limits. Configure fallback_models so the system can automatically fall back to backup models when the primary one fails.

Core summary: This guide reconstructs the full OpenCode and Oh-My-OpenCode configuration workflow: installation, GitHub Copilot model integration, jdtls registration, Vue LSP extension, and common troubleshooting. Following these stages, developers can quickly build a usable AI coding workflow on Windows.