Repositories

    • FlashInfer: Kernel Library for LLM Serving
      Cuda
      Apache License 2.0
      Updated Jan 21, 2025
    • Mooncake

      Public
      Mooncake is the serving platform for Kimi, a leading LLM service provided by Moonshot AI.
      C++
      Apache License 2.0
      Updated Jan 21, 2025
    • vllm

      Public
      A high-throughput and memory-efficient inference and serving engine for LLMs
      Python
      Apache License 2.0
      Updated Jan 20, 2025
    • A Flexible Framework for Experiencing Cutting-edge LLM Inference Optimizations
      Python
      Apache License 2.0
      Updated Nov 14, 2024