MPT-30B: Raising the bar for open-source foundation models
Introducing MPT-30B, a new, more powerful member of our Foundation Series of open-source models, trained with an 8k context length on NVIDIA H100 Tensor Core GPUs.
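MPT-30B is released on the Hugging Face Hub. Below is a minimal sketch of loading it with the Transformers library, assuming the hub id `mosaicml/mpt-30b` and enough GPU memory for a 30B-parameter checkpoint; MPT ships custom modeling code, so `trust_remote_code=True` is required:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "mosaicml/mpt-30b"  # assumed hub id for the base model

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # ~60 GB of weights in bf16; fp32 would double that
    trust_remote_code=True,      # MPT uses custom attention code (ALiBi, 8k context)
    device_map="auto",           # requires `accelerate`; shards across available GPUs
)

prompt = "MPT-30B is an open-source foundation model that"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Because the model was trained with an 8k context window, prompts up to roughly 8192 tokens can be passed without truncation.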
Related:

- Meet MPT-7B: The Game-Changing Open-Source/Commercially Viable Foundation Model from MosaicML, by Sriram Parthasarathy
- llm-foundry/README.md at main · mosaicml/llm-foundry · GitHub
- Stardog: Customer Spotlight
- Survival of the Fittest: Compact Generative AI Models Are the Future for Cost-Effective AI at Scale (Intel Community)
- LongLoRA: Efficient Fine-tuning of Long-Context Large Language Models
- NeurIPS 2023
- The Code4Lib Journal: Searching for Meaning Rather Than Keywords and Returning Answers Rather Than Links
- Democratizing AI: MosaicML's Impact on the Open-Source LLM Movement, by Cameron R. Wolfe, Ph.D.
- MosaicML Releases Open-Source MPT-30B LLMs, Trained on H100s to Power Generative AI Applications
- Margaret Amori on LinkedIn: MPT-30B: Raising the bar for open-source foundation models