
deepseek-r1-ext

This VS Code extension provides a chat interface for the DeepSeek-R1 model.
Currently it uses deepseek-r1:7b, a smaller model distilled from DeepSeek-R1, so it can run on local hardware.

Features

  • Chat with the DeepSeek R1 model directly within VS Code.
  • Stream responses from the model in real-time.
  • Simple and intuitive webview interface.

Requirements

  • VS Code version 1.96.0 or higher.
  • Ollama to interact with the DeepSeek R1 model.

Installation

To install the .vsix file in VS Code:

Go to the Extensions view, click Views and More Actions..., and select Install from VSIX...

Alternatively, run the following command in your terminal:

    code --install-extension deepseek-r1-ext-0.0.1.vsix

To make sure the model is downloaded, run the following command in the terminal:

    ollama run deepseek-r1:7b

Usage

  • After the extension is installed, open the Command Palette and run the following command:

Deep Seek R1 Chat

  • Ask anything in the chat input.

Contributing

If you want to contribute to this extension, or just want to use a different version of the deepseek-r1 model:

  • Install the Ollama JavaScript API:
    npm install ollama
  • Check the full list of available models in the Ollama model library.
  • In extension.ts, change the model name in this call to the model you want to use:
        const streamResponse = await ollama.chat({
            model: 'deepseek-r1:7b',
            messages: [{ role: 'user', content: prompt }],
            stream: true
        });
    
  • Run and debug the extension.

  • Package the extension. In the VS Code terminal, run:

    vsce package
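For reference, the streaming pattern used by the ollama.chat call above (an async iterable of chunks, each carrying message.content) can be sketched without the model itself. This is a minimal, self-contained sketch: the stubStream generator stands in for the real ollama.chat(...) call, and the postMessage comment only illustrates where the extension would forward text to the webview — neither is taken from this repository's actual code.

```typescript
type ChatChunk = { message: { content: string } };

// Stand-in for `await ollama.chat({ model, messages, stream: true })`,
// which yields chunks of the response as they are generated.
async function* stubStream(): AsyncGenerator<ChatChunk> {
  for (const piece of ['Hello', ', ', 'world!']) {
    yield { message: { content: piece } };
  }
}

// Accumulate streamed pieces, invoking a callback with the text so far —
// in the extension, this is where the webview would be updated, e.g.
// panel.webview.postMessage({ text: full }) (illustrative only).
async function collectResponse(
  stream: AsyncIterable<ChatChunk>,
  onChunk: (textSoFar: string) => void
): Promise<string> {
  let full = '';
  for await (const part of stream) {
    full += part.message.content;
    onChunk(full);
  }
  return full;
}

async function main() {
  const result = await collectResponse(stubStream(), () => {});
  console.log(result); // → Hello, world!
}

main();
```

Swapping in a different model then only requires changing the model string passed to the real chat call; the streaming loop stays the same.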

Extension Settings

This extension does not add any VS Code settings through the contributes.configuration extension point.

Known Issues

  • None at the moment.

Release Notes

0.0.1

  • Initial release of DeepSeek R1 extension.

Enjoy!
