Paladin IDE release and next version plans

Artificial Intelligence LLM model as a plugin in Paladin

The programming environment around Paladin relies on external programs such as PE or Koder for writing code. Why not create a “guru” prompter based on an LLM model, either as a separate program or as part of Paladin?

The model here is MS Visual Studio Code with Copilot, plus the mushrooming competing programs such as Cursor and Trae. The problem is that LLM models are paid; there are also open-source ones, and perhaps a way to train a model on the Haiku API.

This is just a sketch of the concept; I hope something will move on this topic.


Does Paladin have an extensible architecture?
You need to implement that first.

OpenRouter exposes an OpenAI-compatible API and seems to have some free-of-charge models; I haven’t tried it, though.
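
For reference, here is a minimal sketch of what talking to OpenRouter looks like through its OpenAI-compatible endpoint. The model name, environment variable, and prompt are placeholders, not recommendations:

```python
# Minimal sketch: querying OpenRouter through its OpenAI-compatible API.
# Assumes the `openai` Python package (v1+) and an OPENROUTER_API_KEY
# environment variable; the model name below is only a placeholder.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",  # OpenRouter's OpenAI-compatible endpoint
    api_key=os.environ["OPENROUTER_API_KEY"],
)

response = client.chat.completions.create(
    model="some-provider/some-free-model",  # placeholder: pick one from openrouter.ai/models
    messages=[
        {"role": "system", "content": "You are a Haiku API coding assistant."},
        {"role": "user", "content": "Show a minimal BApplication skeleton in C++."},
    ],
)
print(response.choices[0].message.content)
```

A Paladin plugin or a separate “guru” tool could wrap exactly this kind of call behind a local UI.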

This is not a walk in the park: fine-tuning an existing model requires significant machine learning skills, because a new training set must be prepared, tagged, and split into the training data proper and a held-out test set to avoid data leakage.
Moreover, the appropriate strategy has to be selected (supervised vs. unsupervised, transfer learning, etc.), and the hyperparameters will likely need to be adapted, too.
Fine-tuning brings several risks, like overfitting and catastrophic interference, and requires several iterations. It’s a painful process.
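
To illustrate just the split step mentioned above, here is a minimal sketch, assuming the training set is a JSON Lines file of prompt/completion pairs; the file name and the 90/10 ratio are arbitrary choices for illustration:

```python
# Minimal sketch of a train/test split to avoid data leakage.
# Assumes a JSON Lines file of {"prompt": ..., "completion": ...} records;
# the file name and split ratio are illustrative only.
import json
import random

with open("haiku_api_examples.jsonl") as f:
    examples = [json.loads(line) for line in f]

random.seed(42)          # reproducible shuffle
random.shuffle(examples)

split = int(0.9 * len(examples))
train_set = examples[:split]   # used for fine-tuning
test_set = examples[split:]    # held out, never shown to the model during training

print(f"{len(train_set)} training examples, {len(test_set)} test examples")
```

The split itself is the easy part; curating and tagging enough good Haiku API examples is where the real effort goes.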

Someone did a great job creating Amiga Guru, but it’s a hosted OpenAI model that leveraged their infrastructure for fine-tuning.

Cursor, Cline & co. mainly rely on Claude 3.7 Sonnet nowadays. From my tests, it is the only model with significant and reliable knowledge of the Haiku API.

Why not use Claude Code from the terminal instead, or Anon Kode?

Check out https://simonwillison.net/2025/Mar/11/using-llms-for-code/ for a useful article about LLMs and coding.