Rallying Scala (Metals) LSP native support in Claude Code

Hi everyone,

I’d like to bring attention to a feature request I opened on the Claude Code repository: [FEATURE] Add Scala (Metals) to the native LSP plugin lineup · Issue #45132 · anthropics/claude-code · GitHub — requesting native Metals LSP integration for Scala.

Claude Code currently ships with native LSP plugins for 12+ languages, including TypeScript, Rust, Go, Python, Java (jdtls), Kotlin, C#, Ruby, PHP, Swift, Lua, and C/C++. Scala is notably absent from this list, despite Metals being a mature, well-established LSP server that follows the exact same protocol as the already-supported language servers.

Without native LSP support, Claude Code falls back to text-based search (grep) when navigating Scala codebases. This is especially painful for Scala because we have, arguably, one of the most expressive type systems out there. Implicits, extension methods, type aliases, and givens mean grep misses call sites and returns false positives constantly. Semantic intelligence via Metals would directly improve code navigation accuracy.
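To make the grep problem concrete, here is a small illustrative Scala 3 sketch (the names `Show`, `intShow`, and `render` are made up for the example): a given instance is resolved by *type*, so its name never appears at its use sites, and a text search for it finds only the definition line.

```scala
// Illustrative example: contextual abstractions defeat text search.
trait Show[A]:
  def show(a: A): String

// A given instance, resolved by type rather than by name.
given intShow: Show[Int] with
  def show(a: Int): String = s"Int($a)"

// The call in `demo` below uses intShow without ever naming it, so
// grepping for "intShow" finds only the definition above. A semantic
// "find references" via Metals would find the use site too.
def render[A](a: A)(using s: Show[A]): String = s.show(a)

@main def demo(): Unit =
  println(render(42)) // prints "Int(42)"
```

The same applies to extension methods and implicit conversions: the surface syntax at the call site gives grep little or nothing to match on, or matches far too much.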

There are workarounds like Serena, an MCP server that wraps Metals, but its lookups introduce significant overhead.

  • Metals uses the standard LSP protocol — the same one every other supported language server uses.
  • JVM infrastructure already exists — Claude Code already supports Java (jdtls) and Kotlin (kotlin-lsp), so the JVM runtime plumbing is in place.
  • Installation is trivial: cs install metals, same as any other Coursier-based tool.
  • Implementation would be structurally identical to existing plugins.

In short, this is low-hanging fruit from an implementation standpoint, but high impact for Scala developers using AI-assisted tooling.

Urgent: the issue needs engagement

Anthropic’s bot has flagged the issue as a potential duplicate of two older, more generic requests (#24735, #24734). Hopefully this issue makes a targeted, well-documented case for Metals support.

How you can help

  1. Go to the issue and thumbs-down 👎 the bot’s duplicate comment to prevent auto-closure.
  2. Thumbs-up 👍 the issue itself to signal demand.
  3. Leave a comment sharing your use case — the more Scala-specific context Anthropic sees, the stronger the case.

Link: [FEATURE] Add Scala (Metals) to the native LSP plugin lineup · Issue #45132 · anthropics/claude-code · GitHub

Thanks for your time.

6 Likes

Thanks for looking into this! The reason I haven’t looked into this myself is that I feel a dedicated MCP server, like the one Metals itself offers, is better suited to agents.

It’s also quite easy to set up currently, either by running Metals itself with the MCP server option turned on, or by running a standalone MCP server installable via:

cs install metals-mcp
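If it helps, wiring that binary into Claude Code might look roughly like the sketch below. This is a hedged sketch only: the exact `claude mcp add` argument order, and whether `metals-mcp` speaks stdio by default, are assumptions to verify against the current Metals and Claude Code documentation.

```shell
# Install the standalone Metals MCP server with Coursier (as above)
cs install metals-mcp

# Register it with Claude Code as an MCP server (assumes a stdio server
# named metals-mcp on PATH; check `claude mcp add --help` for the
# exact syntax in your version)
claude mcp add metals-mcp -- metals-mcp
```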

That said, it might also be worth mentioning the Metals LSP on that webpage, to give more exposure to it and to Scala overall. Especially since we can push for safer LLM code with Scala, it’s worth putting it out there as much as possible.

5 Likes

I ran some benchmarks and reported my findings in this comment [FEATURE] Add Scala (Metals) to the native LSP plugin lineup · Issue #45132 · anthropics/claude-code · GitHub

In short, Metals itself is fast; the problem seems to be something related to the transport layer of Claude’s MCP. I probably misjudged Serena as well; it was probably Claude’s MCP layer all along.

In any case, querying the metals-mcp directly will be my go-to option for now; I’ll just tell Claude to use it that way.

1 Like

Was that part of the comment clowned like the rest of the Clown Code LSP “support”?

If nothing has changed since last week, when I tried the Java integration the last time, all that stuff actually does not work; it’s a complete clown show!

This nonsense just downloads another instance of the JDTLS and tries to run it without any config. Which of course fails miserably! All that while there is a fully working JDTLS already in place in the Java dev tools, but Clown Code isn’t able to use it…

This whole shit was never tested, it’s just the result of some Clown Code vibes. Of course, exactly like the whole rest of Clown Code.

Besides that, LSP is a terrible protocol for LLMs; they can’t handle it properly. Whether it’s invoked through Clown Code’s “native” support or through an MCP server makes no real difference.

For proper Scala (and Java) IDE support you need a tight integration between the LSP, the BSP, and the DAP.

Nothing like that exists right now, and all the (usually vibe-coded) “solutions” I found of course don’t work properly.

The first step to fixing that issue would be to actually define some “natural language” “protocol” that lets LLMs use IDE tools efficiently. The real IDE protocols aren’t LLM-friendly, and MCP-based “solutions” need very heavyweight wrappers, which also eat a lot of tokens and context, as they need very sophisticated instructions to the LLM to work more or less at all. (Which just amplifies the problem that more context isn’t necessarily good, as it “distracts” the models, mentioned here: [2602.11988] Evaluating AGENTS.md: Are Repository-Level Context Files Helpful for Coding Agents? )

Because it’s the era of outsourcing thinking to token generators, let’s just use “AI” to explain my claim that current IDE tooling is very problematic for LLMs:

https://claude.ai/share/99669d1b-323d-4507-971c-78cf7a516aaf

1 Like

Please read what I posted in Anthropic’s issue as a follow-up message (the part about using the LSP directly via HTTP instead of relying on Claude’s MCP client).

It works really well, actually. At least it works for me; maybe your use case is stricter or more demanding.

3 Likes

Can it install the lsp to copilit cli too, thanks

Do you mean Copilot CLI? It should be possible after installing Metals and following Adding MCP servers for GitHub Copilot CLI - GitHub Docs.

Was that part of the comment clowned like the rest of the Clown Code LSP “support”?

@MateuszKowalewski While I understand the frustration when these tools don’t work, and with the overall state of the current programming world, let’s try to stay on point and avoid loaded language. You can always just say that the LSP plugins don’t work for you and explain the reasons, without venting your frustration; it doesn’t add to a constructive discussion.

9 Likes

Did you manage to get Claude to use the LSP directly instead of going through MCP?

Yeah, I basically just told the agent not to use its MCP tools and instead query the LSP directly using curl.

Or actually, I mean, it still uses MCP, just not through its standard tools. I wonder if one can instruct Claude to talk LSP directly? I haven’t tried that.
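For anyone wondering what that direct querying looks like on the wire, here is a minimal Scala sketch. The HTTP endpoint and port in the comment are hypothetical placeholders (see the repo below for the real incantation); only the JSON-RPC request shape is standard, taken from the LSP textDocument/definition method.

```scala
// Hedged sketch: build a standard LSP "textDocument/definition" request
// body. Only the JSON-RPC shape follows the LSP specification; where to
// POST it depends on the proxy/server setup.
def definitionRequest(uri: String, line: Int, character: Int): String =
  s"""{"jsonrpc":"2.0","id":1,"method":"textDocument/definition",
     |"params":{"textDocument":{"uri":"$uri"},
     |"position":{"line":$line,"character":$character}}}""".stripMargin

@main def buildRequest(): Unit =
  val body = definitionRequest("file:///workspace/Main.scala", 10, 4)
  // An agent could be instructed to POST this with curl, e.g.
  // (hypothetical endpoint and port):
  //   curl -s -X POST http://localhost:8765/lsp -d "$BODY"
  println(body)
```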

For exact snippets (or a very good approximate) of what I have in my AGENTS.md, I’ve set up a public repo at GitHub - NovaMage/agents-metals-direct-lsp: Just a sample file showing AGENTS.md notes so that the agent uses metals lsp without going through MCP · GitHub

It’s basically what I have minus any private projects information.

3 Likes

@tgodzik you’re right.

I’m sorry! :person_bowing:

I should have pressed “draft” instead of “send” and revisited this only after one or two good nights’ sleep.

I’m indeed a bit frustrated currently, as I’m knee-deep in some horrible Java spaghetti and spent about three quarters of a day trying to make the advertised LSP integration in Claude Code work, but gave up after some time once I saw that it would need quite some effort, as nothing worked as written in the obviously hallucinated “docs”. (Claude is fully vibe-coded, that’s no secret…)

My use case isn’t in any way “demanding”. All I wanted was LSP support for Java using the Claude extension in VSCode.

It does not work for me, even after some deep dive.

How did you manage to get it working? This would be really helpful for me!

I mean, one could set up the whole external JDTLS fully manually. But I didn’t go down that rabbit hole (seeing that was actually the point where I gave up). Such a setup would be problematic anyway, I think, as it would mean two JDTLS instances running concurrently on the same code base, since I see no way to make the Claude Code VS Code extension use the Java language tools already running inside VS Code.

After reading a bit about how the LSP support actually works, and what this means for LLMs, I actually think @tgodzik is also right with one of his first comments here in that regard: it’s likely better overall to route the LSP integration over MCP. This offers, for example, the chance to add custom instructions for the agents (and they really benefit from that, for the reasons pointed out by the AI itself, even if this means context pollution and “token waste”).

Once more, sorry for the tone, this was a mistake. I try to better avoid that in the future.

But the “AI situation” is really frustrating. Completely overblown promises while basically nothing works correctly or reliably. This, while I was always dreaming of being able to talk to the computer like they did in Star Trek. Child dreams completely destroyed…

2 Likes

@MateuszKowalewski I’ve set up a repository at GitHub - NovaMage/agents-metals-direct-lsp: Just a sample file showing AGENTS.md notes so that the agent uses metals lsp without going through MCP · GitHub with a sample fragment you can use for your CLAUDE.md (symlinked to AGENTS.md, if you follow the industry standard) so that it can use metals-mcp while avoiding this unnecessary overhead.

That’s for Scala, but I’m sure you could do something similar for Java. Please look at the benchmarks; it makes a huge difference.

Thanks for the reply!

But that’s not really solving the issue as the Java LSP integration does not work at all. It calls the tool, but the tool just bails out as the JDTLS isn’t configured properly.

The “official” support is plain broken and can’t work the way it’s done. It seems to be just something “vibe-copy-pasted” from the other LSP integrations, which don’t need extensive config to work; that config is completely missing from the “official” Java LSP Claude thingy.

I’ve also tried some external things, some “LSP-MCP-bridges”, but all the stuff I found so far is also just some vibe-coded things which do not work.

I’m back to letting the agent grep around the code base, which for Java kind of “works”, as Java is less demanding, but it’s still pretty broken overall; the agents miss a lot of stuff this way.

In case someone has a working Java LSP Claude integration (I don’t care whether it’s the “official” LSP integration directly or some MCP bridge), I would be very grateful to learn how it works. (Of course nobody needs to google for me; I did that already and wasted quite some time on it.)

1 Like

There is more important information about the root cause (Claude tool-turns) covered in the repo’s README, but for anyone landing here from the thread: I generalized the same direct-HTTP pattern Mago demonstrated for metals-mcp across Python, TypeScript, C#, Vue, and Scala (my current working stack): GitHub - magaransoft/claude-lsp-direct: Claude Code LSP acceleration — per-workspace HTTP proxies that sidestep the per-tool-call round-trip for any language server. ~100× faster perceived latency vs native LSP(). Ships wrappers for Python, TypeScript, C#, Vue, Scala; extensible to any LSP server. · GitHub

Warm calls at ~0.07s vs ~9s for Claude Code’s native LSP() (same ~100x ballpark as his 230x Scala measurement). Per-workspace HTTP spawn also fixes csharp-ls’s rootUri-at-init binding, which was blocking cross-.NET-project navigation mid-session.

MIT, install.sh-driven (symlinks into ~/.claude/bin/ + merges settings). README credits @NovaMage’s benchmark + links back to his repo. Template doc in CONTRIBUTING.md for anyone wanting to add a language.

@tgodzik thanks btw.

@MateuszKowalewski Your Java mention pushed this forward. claude-lsp-direct now ships a java-direct wrapper backed by jdtls, same per-workspace HTTP-proxy pattern as the others. Warm calls land sub-100 ms on the bundled fixture; cold start on a real Maven/Gradle project pays the usual JDT.LS dependency-resolution tax once, then warm thereafter.

PR: add java-direct (jdtls) wrapper — v1.1.0 by Blanquitoh · Pull Request #1 · magaransoft/claude-lsp-direct · GitHub

If you give it a spin, I’d appreciate your numbers on a real codebase — cold start, warm round-trip, and which methods you end up leaning on most. That’s the signal that would shape the next iteration.

Sorry for the delay, but ironically we published the latest comments at the same time. But here you go.

3 Likes