Pull requests: openvinotoolkit/openvino.genai

Bump actions/dependency-review-action from 4.6.0 to 4.7.0 (#2185, opened May 9, 2025 by dependabot [bot])
Labels: category: GHA (CI based on GitHub Actions), dependencies (Pull requests that update a dependency file), github_actions (Pull requests that update GitHub Actions code)

[Coverity] Fix null pointer dereferences (#2184, opened May 9, 2025 by popovaan)
Labels: category: llm_bench (Label for tool/llm_bench folder), category: LLM samples (GenAI LLM samples)

Benchmark add empty lora test (#2183, opened May 9, 2025 by wenyi5608)
Labels: category: llm_bench (Label for tool/llm_bench folder)

[llm_bench] fix vlm processing without image and add more supported models (#2182, opened May 9, 2025 by eaidova)
Labels: category: llm_bench (Label for tool/llm_bench folder)

Fix tokenizer with incorrect property (#2177, opened May 8, 2025 by WeldonWangwang)
Labels: category: tokenizers (Tokenizer class or submodule update)

Fix race cond. Move get_awaiting_requests method to base class (#2174, opened May 7, 2025 by olpipi)
Labels: category: continuous batching (Continuous batching), category: prompt lookup (Prompt look-up decoding), category: speculative decoding (Speculative decoding)

Reorder model compilation
Labels: category: continuous batching (Continuous batching), category: prompt lookup (Prompt look-up decoding)

[DRAFT] poetry dependencies management (#2159, opened May 5, 2025 by mryzhov)
Labels: category: cmake / build (CMake scripts), category: GHA (CI based on GitHub Actions), category: WWB (PR changes WWB), no-match-files

Switch VLM to ContinuousBatching by default.
Labels: category: continuous batching (Continuous batching), category: LLM (LLM pipeline (stateful, static)), category: visual language (Visual language pipeline), no-match-files

Switch NPU Whisper to ov::genai::WhisperStatefulImpl
Labels: category: NPU (NPU related topics), category: whisper (Whisper pipeline)

[Test] Moe/test (#2118, opened Apr 27, 2025 by peterchen-intel, draft)
Labels: category: continuous batching (Continuous batching), do_not_merge, do_not_review, no-match-files

[POC] Image generation multi-concurrency idea (#2113, opened Apr 25, 2025 by dkalinowski)
Labels: category: continuous batching (Continuous batching), category: CPP API (Changes in GenAI C++ public headers), category: Image generation samples (GenAI Image generation samples), category: image generation (Image generation pipelines), category: Python API (Python API for GenAI), category: tokenizers (Tokenizer class or submodule update), no-match-files, WIP

LLM: release plugin once pipeline is removed and WA for GPU
Labels: category: continuous batching (Continuous batching), category: CPP API (Changes in GenAI C++ public headers), category: LLM (LLM pipeline (stateful, static)), category: tokenizers (Tokenizer class or submodule update), no-match-files

Add paired input into genai::Tokenizer
Labels: category: CPP API (Changes in GenAI C++ public headers), category: Python API (Python API for GenAI), category: tokenizers (Tokenizer class or submodule update), enhancement (New feature or request)

Implement SnapKV
Labels: category: continuous batching (Continuous batching), category: CPP API (Changes in GenAI C++ public headers), category: Python API (Python API for GenAI)

Enable chat template by default during WWB evaluation (#2051, opened Apr 14, 2025 by nikita-savelyevv, draft)
Labels: category: WWB (PR changes WWB)

[JS] Upgrade the js package versions to the upcoming releases (#2045, opened Apr 11, 2025 by Retribution98, draft)
Labels: category: GHA (CI based on GitHub Actions), category: LLM samples (GenAI LLM samples), no-match-files

Add attention mask for decoder whisper model
Labels: category: LLM samples (GenAI LLM samples), category: NPU (NPU related topics), category: whisper (Whisper pipeline)