Open Source Licenses
Last updated: March 27, 2026
OSI-Approved License Compliance
InferenceBench and all of its direct dependencies are released under licenses approved by the Open Source Initiative (OSI). We do not distribute or depend on code under non-OSI-approved licenses. Every dependency listed below has been verified against the OSI license list at opensource.org/licenses.
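A check of this kind can be automated in CI for an npm-based project. A minimal sketch using the third-party `license-checker` tool (an assumption — any SPDX-aware license scanner would serve the same purpose, and the allow-list shown is illustrative):

```shell
# Fail (exit code 1) if any production dependency is published under a
# license outside this semicolon-separated allow-list of OSI-approved
# SPDX identifiers.
npx license-checker --production --onlyAllow "MIT;Apache-2.0;ISC;BSD-2-Clause;BSD-3-Clause"
```

Running this on every pull request keeps the verified-against-OSI claim enforceable rather than a one-time audit.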
InferenceBench License
InferenceBench is open source software released under the MIT License (SPDX: MIT).
SPDX-License-Identifier: MIT
MIT License
Copyright (c) 2025-2026 InferenceBench Contributors
Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the "Software"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.
REUSE Compliance
InferenceBench follows the REUSE Specification 3.0 (reuse.software) to ensure every file in the repository carries clear and machine-readable copyright and licensing information. Each source file includes an SPDX header, and a LICENSES/ directory at the repository root contains the full text of every license used across the project.
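As an illustration, the first lines of a source file under this scheme look like the following (the copyright holder and year range here mirror the project's MIT notice above):

```
// SPDX-FileCopyrightText: 2025-2026 InferenceBench Contributors
// SPDX-License-Identifier: MIT
```

The `reuse lint` command from the REUSE tool can then verify that every file in the repository carries such a header and that the `LICENSES/` directory contains each referenced license text.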
Third-Party Libraries
InferenceBench is built with the following open source libraries and frameworks. All licenses are OSI-approved.
| Library | License | SPDX | Description |
|---|---|---|---|
| Next.js | MIT | MIT | React framework for production |
| React | MIT | MIT | UI component library |
| Tailwind CSS | MIT | MIT | Utility-first CSS framework |
| Recharts | MIT | MIT | Charting library for React |
| Zustand | MIT | MIT | Lightweight state management |
| Prisma | Apache 2.0 | Apache-2.0 | Database ORM and toolkit |
| Lucide React | ISC | ISC | Icon library |
| TypeScript | Apache 2.0 | Apache-2.0 | Typed JavaScript superset |
| Zod | MIT | MIT | Schema validation library |
| Vitest | MIT | MIT | Unit testing framework |
Software Bill of Materials (SBOM)
For full supply-chain transparency, InferenceBench publishes a machine-readable Software Bill of Materials (SBOM) in both CycloneDX and SPDX formats. The SBOM is regenerated on every release and includes all direct and transitive dependencies, their versions, and their SPDX license identifiers.
The SBOM files are available in the repository under sbom/ and are attached as artifacts to each GitHub release. For a complete list of dependencies and their licenses, you may also refer to the package.json and package-lock.json files in our GitHub repository.
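As a sketch of how SBOMs in both formats can be regenerated for an npm project (the output paths are illustrative, and the specific tools — the official CycloneDX npm generator and Anchore's syft scanner — are assumptions, not a statement of what this project actually uses):

```shell
# CycloneDX SBOM generated from the npm dependency tree
npx @cyclonedx/cyclonedx-npm --output-file sbom/bom.cdx.json

# SPDX-format SBOM via Anchore's syft scanner (assumed installed)
syft dir:. -o spdx-json > sbom/bom.spdx.json
```

Wiring these commands into the release workflow is what makes "regenerated on every release" automatic rather than manual.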
Export Control / EAR Classification
InferenceBench is classified as EAR99 under the U.S. Export Administration Regulations (EAR). As publicly available open source software that does not contain or implement controlled encryption algorithms beyond standard TLS for HTTPS transport, it is not subject to export licensing requirements under 15 CFR 734.3(b) and 734.7.
This software may be freely exported, re-exported, and transferred to all destinations and end-users, subject only to U.S. sanctions and embargoes administered by OFAC.
Contributor License Agreement (CLA)
All contributors to InferenceBench are required to sign a Contributor License Agreement (CLA) before their contributions can be merged. The CLA ensures that:
- Contributors confirm they have the right to submit their contributions under the project's MIT license.
- The project maintainers receive a perpetual, irrevocable license to use, modify, and redistribute contributions.
- Contributors retain copyright to their own contributions.
- Contributions do not introduce third-party code under incompatible licenses.
The CLA is administered via a GitHub bot. First-time contributors will be prompted to sign electronically when they open a pull request.
DMCA & Takedown Procedure
InferenceBench respects the intellectual property rights of others. If you believe that content on inferencebench.io infringes your copyright, you may submit a DMCA takedown notice to legal@inferencebench.io with the following information:
- Identification of the copyrighted work claimed to be infringed.
- Identification of the material that is claimed to be infringing and its location on the platform.
- Your contact information (name, address, telephone number, email).
- A statement that you have a good faith belief that the use is not authorized by the copyright owner.
- A statement, under penalty of perjury, that the information in the notice is accurate and that you are authorized to act on behalf of the copyright owner.
- Your physical or electronic signature.
We will respond to valid DMCA notices within 10 business days and will remove or disable access to the allegedly infringing material while the claim is investigated.
Trademark Usage Guidelines
InferenceBench displays names of AI models, GPUs, and cloud providers for informational and comparison purposes. All trademarks, registered trademarks, product names, and company names mentioned on this platform are the property of their respective owners. Specifically:
- GPU names (e.g., NVIDIA A100, AMD MI300X, Intel Gaudi) are trademarks of NVIDIA Corporation, Advanced Micro Devices, Inc., and Intel Corporation, respectively.
- Model names (e.g., GPT-4, Claude, Llama, Gemini, Mistral) are trademarks of their respective creators (OpenAI, Anthropic, Meta Platforms, Google, Mistral AI, and others).
- Provider names (e.g., AWS, Azure, Google Cloud, Together AI, Replicate) are trademarks of their respective companies.
Use of these trademarks on InferenceBench does not imply endorsement, affiliation, or sponsorship. InferenceBench uses these names solely under nominative fair use to identify the products being compared and analyzed.
Model Data Attribution
Model specifications displayed on InferenceBench are sourced from official documentation and public model cards published by their respective creators. This includes but is not limited to parameter counts, architecture details, context lengths, and quantization information.
All model names and associated intellectual property belong to their respective creators and organizations.
GPU Data Attribution
GPU specifications are sourced from official vendor documentation. This includes memory capacity, memory bandwidth, compute performance (FLOPS), TDP, and architecture details.
All GPU product names and associated intellectual property belong to their respective manufacturers.
Provider Pricing Attribution
Pricing data is sourced from provider websites and APIs. Prices are subject to change without notice. InferenceBench makes reasonable efforts to keep pricing data current but does not guarantee accuracy at any given time.
All provider names, logos, and associated intellectual property belong to their respective companies.
Disclaimer
InferenceBench is not affiliated with, endorsed by, or sponsored by any GPU vendor, model creator, or cloud provider mentioned on this platform. All trademarks and registered trademarks are the property of their respective owners.