Actions: intel/ipex-llm

Showing runs from all workflows (19,089 workflow runs)

| Run title | Workflow run | Event | Date | Duration | Branch |
|---|---|---|---|---|---|
| [NPU] Fix of c++ convert example | Python Style Check #19428 | Pull request #12797 opened by plusbang | February 10, 2025 03:11 | 18s | plusbang:convert-small-fix |
| Scorecard supply-chain security | Scorecard supply-chain security #392 | Scheduled | February 10, 2025 02:42 | 10s | main |
| Upgrade to vLLM 0.6.6 | Python Style Check #19427 | Pull request #12796 opened by gc-fu | February 10, 2025 02:14 | 18s | gc-fu:upgrade-vllm-0.6.6 |
| CodeQL | CodeQL #45 | Scheduled | February 9, 2025 14:28 | 11s | main |
| Scorecard supply-chain security | Scorecard supply-chain security #391 | Scheduled | February 9, 2025 02:42 | 9s | main |
| Scorecard supply-chain security | Scorecard supply-chain security #390 | Scheduled | February 8, 2025 02:40 | 9s | main |
| Rename NPU public example to llm-cli (#12790) | Python Style Check #19426 | Commit 468d3f2 pushed by hkvision | February 8, 2025 02:20 | 19s | main |
| Rename NPU public example to llm-cli | Python Style Check #19425 | Pull request #12790 synchronize by hkvision | February 8, 2025 02:10 | 18s | hkvision:llm-cli |
| [NPU] Support non-const parameter for decoder layers when keep_ir=Tru… | Python Style Check #19424 | Commit e90a9ad pushed by rnwang04 | February 8, 2025 01:58 | 16s | main |
| update more lora example (#12785) | Python Style Check #19423 | Commit 8aea531 pushed by MeouSker77 | February 8, 2025 01:46 | 22s | main |
| Rename NPU public example to llm-cli | Python Style Check #19422 | Pull request #12790 opened by hkvision | February 7, 2025 12:11 | 17s | hkvision:llm-cli |
| [NPU] Support non-const parameter for decoder layers when keep_ir=True | Python Style Check #19421 | Pull request #12789 synchronize by rnwang04 | February 7, 2025 11:06 | 21s | rnwang04:ln_const |
| [NPU] Support non-const parameter for decoder layers when keep_ir=True | Python Style Check #19420 | Pull request #12789 synchronize by rnwang04 | February 7, 2025 11:04 | 18s | rnwang04:ln_const |
| [NPU] Support non-const parameter for decoder layers when keep_ir=True | Python Style Check #19419 | Pull request #12789 synchronize by rnwang04 | February 7, 2025 11:01 | 22s | rnwang04:ln_const |
| [NPU] Support non-const parameter for decoder layers when keep_ir=True | Python Style Check #19418 | Pull request #12789 synchronize by rnwang04 | February 7, 2025 10:58 | 18s | rnwang04:ln_const |
| Upgrade ipex-llm[cpp] to oneAPI 2025.0 on Windows (#12778) | Python Style Check #19417 | Commit fd28cf1 pushed by Oscilloscope98 | February 7, 2025 10:29 | 27s | main |
| [NPU] Support non-const parameter for decoder layers when keep_ir=True | Python Style Check #19416 | Pull request #12789 synchronize by rnwang04 | February 7, 2025 10:02 | 16s | rnwang04:ln_const |
| [NPU] Support non-const parameter for decoder layers when keep_ir=True | Python Style Check #19415 | Pull request #12789 synchronize by rnwang04 | February 7, 2025 10:00 | 22s | rnwang04:ln_const |
| Upgrade ipex-llm[cpp] to oneAPI 2025.0 on Windows | Python Style Check #19414 | Pull request #12778 synchronize by Oscilloscope98 | February 7, 2025 09:55 | 19s | llama-cpp-oneapi2025_0-update |
| [NPU] Support non-const parameter for decoder layers when keep_ir=True | Python Style Check #19413 | Pull request #12789 opened by rnwang04 | February 7, 2025 09:24 | 17s | rnwang04:ln_const |
| [NPU] Support qwen models with cos_sin_input=True (#12788) | Python Style Check #19412 | Commit ca1d7b7 pushed by plusbang | February 7, 2025 08:41 | 18s | main |
| [NPU] Support qwen models with cos_sin_input=True | Python Style Check #19411 | Pull request #12788 synchronize by plusbang | February 7, 2025 08:24 | 24s | plusbang:qwen2-cos-sin-input |
| Upgrade ipex-llm[cpp] to oneAPI 2025.0 on Windows | Python Style Check #19410 | Pull request #12778 synchronize by Oscilloscope98 | February 7, 2025 08:12 | 17s | llama-cpp-oneapi2025_0-update |
| Upgrade ipex-llm[cpp] to oneAPI 2025.0 on Windows | Python Style Check #19409 | Pull request #12778 synchronize by Oscilloscope98 | February 7, 2025 08:07 | 18s | llama-cpp-oneapi2025_0-update |
| [NPU] Support qwen models with cos_sin_input=True | Python Style Check #19408 | Pull request #12788 synchronize by plusbang | February 7, 2025 07:26 | 20s | plusbang:qwen2-cos-sin-input |