- Generate commit messages directly from the Source Control view.
- Choose your generator: ChatGPT, Gemini, Ollama, LMStudio, Smithery, or a custom endpoint.
- Multilingual output: English, Japanese, Korean, German, and Russian.
- Clean formatting controls: emojis, multiple candidates, and scope formatting with file extensions.
- Fine-tune outputs with model, temperature, max tokens, and endpoint settings.
- Install ProCommit from the Marketplace or OpenVSX.
- Open a Git repository in VS Code.
- Configure your generator and credentials:
  - Command Palette → `ProCommit: Set Generator`
  - Command Palette → `ProCommit: Set API key` (if your generator requires one)
  - Or Settings → `procommit.apiKey`
- Generate a message:
  - Source Control view → click `Generate ProCommit`
  - Or Command Palette → `ProCommit: Generate ProCommit`
- Depending on the generator you select, you may need an API key (for example, OpenAI / Gemini / Smithery).
- For local generators (like Ollama or LMStudio), you typically only need a reachable endpoint; a sketch of such a setup follows this list.
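For example, here is a minimal settings.json sketch for a local setup. It assumes Ollama is serving on its usual default port and uses a placeholder model name; whether ProCommit expects the base URL or a longer API path is not confirmed here, so adjust the values to match your own Ollama or LMStudio instance.

```jsonc
{
  // Sketch only: the endpoint URL and model name below are assumptions.
  // Adjust them to match your local Ollama (or LMStudio) server.
  "procommit.general.generator": "Ollama",
  "procommit.endpoint": "http://localhost:11434",
  "procommit.model": "llama3",
  "procommit.apiKey": ""
}
```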
- Download a VSIX from the Direct Link.
- In VS Code, open Extensions → ... → Install from VSIX…
- Select the downloaded VSIX file.
These keys can be changed in the VS Code Settings UI or in your settings.json.
| Setting | Default | Description |
|---|---|---|
| `procommit.general.generator` | `ChatGPT` | Generator used to create commit messages. Options: ChatGPT, Gemini, Ollama, LMStudio, Smithery, Custom. |
| `procommit.general.messageApproveMethod` | `Quick pick` | How you approve and apply the generated message. Options: Quick pick, Message file. |
| `procommit.general.language` | `English` | Language used for generated commit messages. |
| `procommit.general.showEmoji` | `false` | Include emojis in commit messages. |
| `procommit.general.useMultipleResults` | `false` | When enabled (and using Quick pick), shows multiple generated candidates. |
| `procommit.general.includeFileExtension` | `true` | Include file extensions in commit scope (for example, `app.js` vs `app`). |
| `procommit.apiKey` | empty | API key for generators that require authentication. |
| `procommit.endpoint` | empty | Custom endpoint URL for generators. Leave blank to use the generator default. |
| `procommit.model` | empty | Optional model identifier/version. Leave blank to use the generator default. |
| `procommit.temperature` | `0.2` | Output randomness (lower is more deterministic). |
| `procommit.maxTokens` | `196` | Maximum tokens used for generation. |
```json
{
"procommit.general.generator": "ChatGPT",
"procommit.general.language": "English",
"procommit.general.messageApproveMethod": "Quick pick",
"procommit.general.showEmoji": false,
"procommit.general.useMultipleResults": true,
"procommit.general.includeFileExtension": true,
"procommit.apiKey": "",
"procommit.endpoint": "",
"procommit.model": "",
"procommit.temperature": 0.2,
"procommit.maxTokens": 196
}
```

Available in the Command Palette (a keybinding sketch follows the list):
- `ProCommit: Generate ProCommit`
- `ProCommit: Set API key`
- `ProCommit: Set Generator`
- `ProCommit: Set Language`
- `ProCommit: Set Message Approve Method`
- `ProCommit: Set Include File Extension in Scope`
- `ProCommit: Set Model Version`
- `ProCommit: Set Custom Endpoint`
- `ProCommit: Set Temperature`
- `ProCommit: Set Max Tokens`
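Any of these can also be bound to a shortcut in keybindings.json. The snippet below is only a sketch: the command ID `procommit.generateProCommit` is an assumed internal identifier, not one confirmed by the extension, so look up the real ID in the Keyboard Shortcuts editor before using it.

```jsonc
// keybindings.json (sketch). The command ID below is an assumption;
// find the actual ID for "ProCommit: Generate ProCommit" in the
// Keyboard Shortcuts editor and substitute it here.
[
  {
    "key": "ctrl+alt+g",
    "command": "procommit.generateProCommit",
    "when": "scmProvider == git"
  }
]
```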
- No button in Source Control: ensure `scmProvider == git` and a Git repository is open.
- API errors: verify that `procommit.apiKey`, `procommit.endpoint`, and `procommit.model` match your selected generator (see the sketch after this list).
- No changes detected: stage or modify files before generating; ProCommit uses your repository diff to create a message.
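For the API-error case, here is a hedged settings.json sketch of the three settings working together, shown for a Custom generator pointed at an OpenAI-compatible endpoint; the URL and model name are placeholders rather than values confirmed by ProCommit.

```jsonc
{
  // Sketch only: the endpoint URL and model name are placeholders.
  // All three values must correspond to the generator chosen in
  // "procommit.general.generator", or requests will fail.
  "procommit.general.generator": "Custom",
  "procommit.apiKey": "<your-api-key>",
  "procommit.endpoint": "https://example.com/v1",
  "procommit.model": "<model-name>"
}
```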
Released under the MIT License by @Kochan/koiisan.
Feature requests and new language support are welcome. Please open an issue on the GitHub repository.
