
revert: remove adapters / intrinsics / alora / lora from openai code #543

Merged

jakelorocco merged 6 commits into main from jal/remove-vllm-adapters on Feb 17, 2026

Conversation

@jakelorocco (Contributor) commented Feb 12, 2026

Misc PR

Type of PR

  • Bug Fix
  • New Feature
  • Documentation
  • Other

Description

  • Link to Issue: N/A. Request to remove aLoRA support from the OpenAI backend for vLLM servers, since vLLM will no longer support it.

I have removed all adapter-based code; I wanted to confirm whether that was the intent, or whether we should only remove support for activated LoRAs (aLoRAs) while still allowing plain LoRA adapters. Consensus was to remove all adapter support and keep HuggingFace as the backend that supports adapters/aLoRAs/intrinsics. The reasoning is that whatever vLLM implementation eventually supports aLoRAs / intrinsics / arbitrary adapters will likely look very different, so we are removing this code now.

Testing

  • Tests added to the respective file if code was changed
  • New code has 100% coverage if code was added
  • Ensure existing tests and github automation passes (a maintainer will kick off the github automation when the rest of the PR is populated)

@github-actions (Contributor) commented:

The PR description has been updated. Please fill out the template for your PR to be reviewed.

mergify bot commented Feb 12, 2026

Merge Protections

Your pull request matches the following merge protections and will not be merged until they are valid.

🟢 Enforce conventional commit

Wonderful, this rule succeeded.

Make sure that we follow https://www.conventionalcommits.org/en/v1.0.0/

  • title ~= ^(fix|feat|docs|style|refactor|perf|test|build|ci|chore|revert|release)(?:\(.+\))?:

@jakelorocco jakelorocco force-pushed the jal/remove-vllm-adapters branch 8 times, most recently from 777d43d to 54262dc on February 12, 2026 at 20:01
@jakelorocco (Contributor, Author) commented:

Note: while reverting these changes, I found an issue with using format in generate_from_raw requests. Newer vLLM server versions expect different parameter structures. I added some code to detect the vLLM server version and shape the request accordingly, and added tests for the underlying function. We don't have a great way to test against different vLLM server versions automatically, but I manually tested both an older and a newer version; both worked.
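
For illustration, a minimal sketch of what such version-dependent request shaping could look like, assuming the server exposes vLLM's GET /version endpoint; the version cutoff and the structured-output field names below are hypothetical placeholders, not the PR's actual code:

```python
# Sketch only: detect the vLLM server version and pick the request shape.
# The cutoff version and extra_body field names are illustrative assumptions.
import httpx
from packaging.version import Version


def get_vllm_version(base_url: str) -> Version | None:
    """Best-effort query of the vLLM server version; None if unavailable."""
    try:
        resp = httpx.get(f"{base_url}/version", timeout=5.0)
        resp.raise_for_status()
        return Version(resp.json()["version"])
    except Exception:
        return None


def format_kwargs(base_url: str, json_schema: dict) -> dict:
    """Choose the structured-output parameter shape by server version."""
    version = get_vllm_version(base_url)
    if version is not None and version >= Version("0.10.0"):  # hypothetical cutoff
        # Newer servers: hypothetical newer-style field name.
        return {"extra_body": {"structured_outputs": {"json": json_schema}}}
    # Older servers (or unknown version): fall back to the older-style field.
    return {"extra_body": {"guided_json": json_schema}}
```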

@jakelorocco (Contributor, Author) commented:

@guicho271828, could you please review this when you get a chance? Thanks!

@jakelorocco jakelorocco force-pushed the jal/remove-vllm-adapters branch 3 times, most recently from 52eb26b to 6848fd9 on February 12, 2026 at 20:55
@jakelorocco jakelorocco requested a review from a team as a code owner February 16, 2026 13:33
@nrfulton (Member) commented:

@guicho271828 context: the vLLM aLoRA PR will not be accepted, so the alora/intrinsics code for openai is now all dead code.

Commit message: openai backend no longer supports intrinsics for vllm; vllm branch to support aloras was closed with prejudice.
@guicho271828 guicho271828 force-pushed the jal/remove-vllm-adapters branch from dce4862 to d6a9d62 on February 17, 2026 at 18:25
@guicho271828 guicho271828 force-pushed the jal/remove-vllm-adapters branch from 0206743 to 065f7f3 on February 17, 2026 at 21:11
@guicho271828 (Contributor) commented:

Rewrote the tests; tests pass. Since there is no longer any need for the forked vLLM, the test now simply launches a child vLLM process.
[Screenshot from 2026-02-17 16-15-07: passing test run]
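
For context, launching a child vLLM process from a test might look roughly like the sketch below, assuming the `vllm serve` CLI and its OpenAI-compatible `/health` endpoint; the fixture name, model, and port are hypothetical, not the actual test code:

```python
# Sketch only: spawn a child vLLM server for the test session and wait for
# it to report healthy before yielding its base URL to the tests.
import subprocess
import time

import httpx
import pytest


@pytest.fixture(scope="session")
def vllm_server():
    proc = subprocess.Popen(
        ["vllm", "serve", "facebook/opt-125m", "--port", "8123"],
    )
    base_url = "http://localhost:8123"
    try:
        # Poll the health endpoint until the server is ready (up to ~2 min).
        for _ in range(120):
            try:
                if httpx.get(f"{base_url}/health", timeout=2.0).status_code == 200:
                    break
            except httpx.TransportError:
                pass
            time.sleep(1)
        else:
            raise RuntimeError("vLLM server did not become ready")
        yield base_url
    finally:
        # Always shut the child process down, even if tests fail.
        proc.terminate()
        proc.wait(timeout=30)
```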

@jakelorocco jakelorocco disabled auto-merge February 17, 2026 21:31
@jakelorocco jakelorocco merged commit 8316495 into main Feb 17, 2026
4 checks passed
@jakelorocco jakelorocco deleted the jal/remove-vllm-adapters branch February 17, 2026 21:33
