Define artificial intelligence models in Docker Compose applications

Docker Compose now lets you declare artificial intelligence model dependencies directly in your application's configuration, making model provisioning and lifecycle management seamless across local and cloud platforms.

Docker Compose enables developers to define artificial intelligence models as first-class components within their application stacks. By introducing the models top-level element in the Compose file, teams can specify model dependencies alongside traditional service definitions. This architecture ensures that the required models are provisioned and managed in tandem with application services, which streamlines deployment and fosters portability across various environments supporting the Compose specification, such as Docker Model Runner and certain cloud providers.
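As a minimal sketch of this layout, a Compose file might pair a service with a model declared under the top-level models element. The application image name, service name, and model image (ai/smollm2) below are illustrative, not taken from the original text:

```yaml
services:
  app:
    image: my-chat-app        # illustrative application image
    models:
      - llm                   # bind this service to the model defined below

models:
  llm:
    model: ai/smollm2         # illustrative model artifact pulled by the runner
```

With this shape, the model is provisioned alongside the service whenever the stack is deployed, on any platform that implements Compose models.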

Using this feature requires Docker Compose version 2.38 or later and a platform that supports Compose models. Models are defined under the models section, where developers can reference model images, set configuration parameters such as context size, and pass runtime flags to the inference engine. Services bind to models via short or long syntax: the short syntax automatically generates environment variables (e.g., LLM_URL), while the long syntax allows custom variable names for additional flexibility. Examples in the documentation cover declaring language models and embedding models, as well as specifying cloud-optimized configurations using labels.
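The options named above can be combined in one file. The sketch below assumes the attribute names endpoint_var and model_var for the long-syntax binding, and the specific flag, context size, and label values are illustrative:

```yaml
models:
  llm:
    model: ai/smollm2               # illustrative model image
    context_size: 8192              # configuration parameter: context window size
    runtime_flags:                  # flags passed through to the inference engine
      - "--temperature"
      - "0.2"
    labels:
      - "cloud.instance-type=gpu-small"   # illustrative cloud-optimized hint

services:
  app:
    image: my-chat-app
    models:
      llm:                          # long syntax: choose the variable names yourself
        endpoint_var: CHAT_MODEL_URL
        model_var: CHAT_MODEL_NAME
```

Here the service receives CHAT_MODEL_URL and CHAT_MODEL_NAME instead of the auto-generated names the short syntax would produce.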

Platform portability is a defining aspect of Compose models. When running locally with Docker Model Runner, the defined models are pulled, executed, and the appropriate environment variables are injected into containers. On cloud platforms that support Compose models, the same configuration file can trigger the use of managed artificial intelligence services, benefit from cloud-specific scaling, and utilize enhanced monitoring, logging, and model versioning. This approach supports seamless migration between local and cloud environments and helps teams manage model access, configuration, and updates consistently. Further documentation is available for deep dives into syntax, advanced options, and integration best practices.



