Large Language Model (LLM) Coding Assistance


Note: It has been about three months since this was originally written, so some of the information below is out of date. See the addendum for updated information.


With all the hype surrounding generative AI and LLMs, and all the hallucinations mentioned in the news, what are these models actually good for?

As it turns out, LLMs trained for code generation are genuinely helpful. But what if you don’t want your code going to a cloud provider? The setup below is a great solution for that.

Here is the plan:

  • Install Ollama and load the model
  • Install Continue
  • Try it out
  • Conclusion

Install Ollama and load the model

Ollama allows you to run models locally:

  • Install ollama from https://ollama.com/
  • Open a terminal window (these commands were run on an M1 MacBook Pro)
  • Run ollama list (This should start ollama and show that no models are available)
  • Run ollama pull g1ibby/deepseek:6.7b (that is a one after the g)
  • When this completes run ollama list again and you should see
NAME                    ID              SIZE      MODIFIED
g1ibby/deepseek:6.7b    f7f889d53789    3.8 GB    15 seconds ago
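
Before connecting an IDE, you can sanity-check the model directly from the terminal; ollama prints its answer and returns to the shell prompt (the prompt text here is just an example):

ollama run g1ibby/deepseek:6.7b "Write a TypeScript function that reverses a string"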

Install Continue

Continue allows an IDE to access a model running in ollama:

  • In VSCode, install the Continue plugin from Extensions sidebar (https://continue.dev/)
  • Open the Continue sidebar
  • Select the Config gear
  • And replace the existing “models” section with
  "allowAnonymousTelemetry": false,
  "models": [
    {
      "title": "DeepSeek 6.7B",
      "provider": "ollama",
      "model": "g1ibby/deepseek:6.7b"
    }
  ],
  • Note: Setting allowAnonymousTelemetry to false keeps the extension from sending usage data, so everything stays completely offline. In some installs allowAnonymousTelemetry already exists; remove the existing line so it is not duplicated. (A complete minimal example of the edited config is shown after this list.)
  • Select DeepSeek 6.7B in the dropdown to the left of the Config gear
  • You may also want to move Continue to the right sidebar
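
After the edit, the relevant part of config.json should look roughly like this (a minimal sketch showing only the settings discussed here; any other settings already in the file can be left alone):

{
  "allowAnonymousTelemetry": false,
  "models": [
    {
      "title": "DeepSeek 6.7B",
      "provider": "ollama",
      "model": "g1ibby/deepseek:6.7b"
    }
  ]
}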

Try it out

Let’s say you want to store some data in PostgreSQL.

  • Press Command+L (Control+L on Linux) to open a new Continue session (the extension also has an IDE-specific tutorial for IntelliJ showing the keyboard shortcuts)
  • Type: Add SQL for the postgreSQL database to create a table with the following 4 columns: a uuid primary key, a required first name, a required last name, and an optional note

This will generate something like:

Here is a simple example of how you could create such a table in PostgreSQL using SQL commands:

CREATE TABLE users (
    id UUID PRIMARY KEY DEFAULT uuid_generate_v4(),  -- Generate a new unique UUID for each row
    first_name VARCHAR(50) NOT NULL,                   -- Required first name
    last_name VARCHAR(50) NOT NULL,                    -- Required last name
    note TEXT                                           -- Optional note
);
In this script:

 * UUID PRIMARY KEY DEFAULT uuid_generate_v4() creates a column of UUID type as the primary key. The function uuid_generate_v4() is used to generate a new unique UUID for each row.
 * VARCHAR(50) NOT NULL defines a string field that can't be null and has a maximum length of 50 characters.
 * TEXT is just the data type for text fields, which can store large amounts of text data (up to ~1GB).
Please replace the column names and types with your preferred ones if needed. The above script assumes you have the UUID extension installed in PostgreSQL. If not, you would need to install it first using CREATE EXTENSION IF NOT EXISTS "uuid-ossp";.

Usually something is wrong with the answer (here, for example, gen_random_uuid() has been built into PostgreSQL since version 13, so no extension is needed), but it is generally close enough to be helpful to experienced developers and save time. Response time on an M1 Mac varies from almost instant to a few seconds.
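
For reference (a hand-written correction, not part of the model’s output), on PostgreSQL 13 or newer the same table can be created without installing any extension:

CREATE TABLE users (
    id         UUID PRIMARY KEY DEFAULT gen_random_uuid(),  -- built in since PostgreSQL 13
    first_name VARCHAR(50) NOT NULL,                        -- required first name
    last_name  VARCHAR(50) NOT NULL,                        -- required last name
    note       TEXT                                         -- optional note
);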

A function can also be inserted directly into the code by prefixing the query with /edit. For example, inside a TypeScript file:

  • Select a blank line where you want to insert the function and press Command+Shift+L
  • Type /edit create a function to remove all spaces from a string

And about a second later you have:

function removeSpaces(str) {
  return str.replace(/\s/g, '');
}

(again not quite correct: a poorly named parameter, no null check, and no type annotations)
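
A cleaned-up version with those fixes applied (written by hand, not generated) might look like:

function removeSpaces(input: string | null | undefined): string {
  // Treat null/undefined as an empty string instead of throwing
  if (input == null) {
    return '';
  }
  // \s removes all whitespace characters, not just plain spaces
  return input.replace(/\s/g, '');
}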

Conclusion

Quick code generation using LLMs seems like a useful helper for any developer, although everything it generates needs to be verified and tested. I need to spend more time with this to see how it performs on different coding tasks, and I plan to share the results in future posts.

Addendum

Continue is now much easier to use and there are newer models available to run code generation locally. The latest version of Continue also has inline code completion using the starcoder2 model.

Ollama

  • Install ollama from https://ollama.com/
  • Open a terminal window (these commands were run on an M1 MacBook Pro)
  • Run ollama ls (This should start ollama and show that no models are available)
  • Run ollama pull deepseek-coder:6.7b
  • Run ollama pull starcoder2:3b
  • When these complete, run ollama ls again and you should see
NAME                    ID              SIZE      MODIFIED
starcoder2:3b          f67ae0f64584    1.7 GB    29 seconds ago
deepseek-coder:6.7b    ce298d984115    3.8 GB    About a minute ago

Continue

  • In VSCode, install the Continue plugin from Extensions sidebar (https://continue.dev/)
  • Open the Continue sidebar (Command+L can be used if it is not easy to find)
  • The models will be loaded automatically. This may take a minute.
  • Select Ollama – deepseek-coder:6.7b from the dropdown
  • Select the Config gear
  • Change allowAnonymousTelemetry to false
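
If the inline completion model is not picked up automatically, config.json can also point the tab-autocomplete model at starcoder2 explicitly (a sketch based on the Continue config format at the time of writing; check the current docs for the exact key name):

  "tabAutocompleteModel": {
    "title": "StarCoder2 3B",
    "provider": "ollama",
    "model": "starcoder2:3b"
  }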

Conclusion

Generative AI for code generation is even more popular now than it was three months ago, and protecting digital assets will always be important. Running code generation 100% offline is a safe way to benefit from generative AI while avoiding the pitfalls of sharing code with a cloud provider. The Continue plugin, which keeps improving, is a good way to accomplish this.