Blog
Latest articles, tutorials, and product updates

Self-hosting OpenAI GPT Models
OpenAI recently released GPT OSS, their first open-weight models designed for powerful reasoning and agentic tasks. With the gpt-oss-120b and gpt-oss-20b models now available under the permissive Apache 2.0 license, you can run your own production-grade AI without vendor lock-in and with full data privacy. DollarDeploy makes deploying these models straightforward: in just a few clicks, you can have your own reasoning AI running on your server, with full control over costs and the complete privacy of self-hosting.
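As a rough illustration, here is a minimal sketch of what talking to a self-hosted gpt-oss model can look like, assuming the runtime you deploy (for example vLLM or Ollama) exposes an OpenAI-compatible chat completions endpoint; the base URL, API key, and model name below are placeholders, not values provided by DollarDeploy.

```python
# Minimal sketch: query a self-hosted gpt-oss model over an
# OpenAI-compatible API. The base_url, api_key, and model name
# are placeholders; substitute the values of your own deployment.
from openai import OpenAI

client = OpenAI(
    base_url="http://your-server:8000/v1",  # placeholder: your self-hosted endpoint
    api_key="not-needed-for-local",         # many self-hosted runtimes ignore the key
)

response = client.chat.completions.create(
    model="gpt-oss-20b",  # or gpt-oss-120b on larger hardware
    messages=[{"role": "user", "content": "Summarize why open-weight models matter."}],
)

print(response.choices[0].message.content)
```

Because the endpoint speaks the same protocol as OpenAI's hosted API, existing client code usually only needs the base URL swapped.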
Latest Articles
Latest Releases
We are cooking up tons of new features: PHP support, more native services on the host, and an upcoming command-line tool!

New features
* Experimental PHP support. Allows you to build and natively host PHP apps, using PHP-FPM in a completely isolated environment. Please take a look at our example app, which you can deploy easily using DollarDeploy.
* MariaDB service. We install and automatically configure MariaDB on your server (a connection sketch follows below).

Fixes and improvements
* Improve Settings => Subscription…
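As a rough sketch of using the new MariaDB service from a deployed app, the snippet below connects with PyMySQL; the environment variable names are hypothetical placeholders rather than anything DollarDeploy necessarily injects, so substitute the connection details from your own setup.

```python
# Minimal sketch: connect an app to a MariaDB instance provisioned on the host.
# The environment variable names are hypothetical placeholders; use whatever
# connection details your deployment actually provides.
import os
import pymysql

conn = pymysql.connect(
    host=os.environ.get("DB_HOST", "127.0.0.1"),
    user=os.environ["DB_USER"],
    password=os.environ["DB_PASSWORD"],
    database=os.environ.get("DB_NAME", "app"),
)

with conn.cursor() as cur:
    cur.execute("SELECT VERSION()")
    print(cur.fetchone())  # e.g. a MariaDB version string

conn.close()
```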
We have new amazing features in this release, and we can't wait to share them with you! This release goes live two days before the top Nordic event, Slush 2025. Are you coming to Slush? Can't wait to meet you in Helsinki, Finland on 19-20 November!

New features
* Public API: We are releasing our public API today. Take a look here: https://docs.dollardeploy.com/api. Using the API, you can automate deployments from your CI/CD pipeline (a sketch follows below). Combine it with notifications and webhooks…
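To make the CI/CD idea concrete, here is a minimal, hedged sketch of triggering a deployment from a pipeline step; the endpoint path, payload fields, and token variable name are illustrative placeholders only, so check https://docs.dollardeploy.com/api for the actual routes, authentication scheme, and parameters.

```python
# Minimal sketch: trigger a deployment from a CI/CD pipeline via the public API.
# The endpoint path and payload below are illustrative placeholders only;
# see https://docs.dollardeploy.com/api for the real routes and parameters.
import os
import requests

API_TOKEN = os.environ["DOLLARDEPLOY_API_TOKEN"]  # hypothetical secret name in your CI

resp = requests.post(
    "https://dollardeploy.com/api/<your-deploy-endpoint>",  # placeholder path
    headers={"Authorization": f"Bearer {API_TOKEN}"},       # placeholder auth scheme
    json={"app": "my-app"},                                 # placeholder payload
    timeout=30,
)
resp.raise_for_status()
print("Deployment triggered:", resp.json())
```

A step like this can run after your tests pass, so every green pipeline ships automatically.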
TLDR: This release is all about speed and control. Set up apps faster with host selection at creation, cancel stuck deployments instantly, and deploy modern Python apps with UV (a minimal sketch follows below). Plus, we're launching GPT-OSS self-hosting templates that let you run OpenAI's reasoning models on your own hardware.

New
* Host Selection During App Creation: No more back-and-forth configuration. When you create an app, you can now select which host it should run on right from the start. Your app is production-ready…
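For the UV support mentioned above, here is a minimal sketch of a single-file Python app using inline script metadata (PEP 723), which uv reads when you execute `uv run app.py` to resolve the declared dependencies; the Flask framework and port are arbitrary example choices, not a DollarDeploy requirement.

```python
# app.py -- minimal sketch of a uv-runnable Python app.
# The inline metadata block below (PEP 723) lets `uv run app.py`
# resolve and install the declared dependencies automatically.
# /// script
# requires-python = ">=3.11"
# dependencies = ["flask"]
# ///
from flask import Flask

app = Flask(__name__)

@app.route("/")
def index():
    return "Hello from a uv-managed app!"

if __name__ == "__main__":
    # Port 8000 is an arbitrary example; bind to whatever your deployment expects.
    app.run(host="0.0.0.0", port=8000)
```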


