diff --git a/README.md b/README.md
index bbba8e136a..016091186b 100644
--- a/README.md
+++ b/README.md
@@ -38,43 +38,21 @@ It streamlines development, training, and inference, and is compatible with any
-### Installation
+### Launch the server
 
-> Before using `dstack` through CLI or API, set up a `dstack` server. If you already have a running `dstack` server, you only need to [set up the CLI](#set-up-the-cli).
+> Before using `dstack` through CLI or API, set up a `dstack` server. If you already have a running `dstack` server, you only need to [install the CLI](#install-the-cli).
 
-#### Set up the server
-
-##### Configure backends
-
-To orchestrate compute across GPU clouds or Kubernetes clusters, you need to configure backends.
-
-Backends can be set up in `~/.dstack/server/config.yml` or through the [project settings page](https://dstack.ai/docs/concepts/projects#backends) in the UI.
-
-For more details, see [Backends](https://dstack.ai/docs/concepts/backends).
+To orchestrate compute across GPU clouds or Kubernetes clusters, you need to [configure backends](https://dstack.ai/docs/concepts/backends).
 
 > When using `dstack` with on-prem servers, backend configuration isn’t required. Simply create [SSH fleets](https://dstack.ai/docs/concepts/fleets#ssh-fleets) once the server is up.
 
-##### Start the server
-
-You can install the server on Linux, macOS, and Windows (via WSL 2). It requires Git and
+The server can be installed on Linux, macOS, and Windows (via WSL 2). It requires Git and
 OpenSSH.
 
-##### uv
-
 ```shell
 $ uv tool install "dstack[all]" -U
-```
-
-##### pip
-
-```shell
-$ pip install "dstack[all]" -U
-```
-
-Once it's installed, go ahead and start the server.
-
-```shell
 $ dstack server
+
 Applying ~/.dstack/server/config.yml...
 The admin token is "bbae0f28-d3dd-4820-bf61-8f4bb40815da"
@@ -84,27 +62,18 @@ The server is running at http://127.0.0.1:3000/
 
 > For more details on server configuration options, see the [Server deployment](https://dstack.ai/docs/guides/server-deployment) guide.
 
+### Install the CLI
 
-<details>
-<summary>Set up the CLI</summary>
-
-#### Set up the CLI
+<details>
+<summary>If the CLI is not installed with the server</summary>
 
-Once the server is up, you can access it via the `dstack` CLI.
+Once the server is up, you can access it via the `dstack` CLI. The CLI can be installed on Linux, macOS, and Windows. It requires Git and OpenSSH.
-
-##### uv
-
 ```shell
 $ uv tool install dstack -U
 ```
 
-##### pip
-
-```shell
-$ pip install dstack -U
-```
-
 To point the CLI to the `dstack` server, configure it with the server address, user token, and project name:
@@ -113,12 +82,22 @@ $ dstack project add \
     --name main \
     --url http://127.0.0.1:3000 \
     --token bbae0f28-d3dd-4820-bf61-8f4bb40815da
-
+
 Configuration is updated at ~/.dstack/config.yml
 ```
 
 </details>
 
+### Install AI agent skills
+
+Install [skills](https://skills.sh/dstackai/dstack/dstack) to help AI agents use the `dstack` CLI and edit configuration files.
+
+```shell
+$ npx skills add dstackai/dstack
+```
+
+AI agents like Claude, Codex, and Cursor can now create and manage fleets and submit workloads on your behalf.
+
 ### Define configurations
 
 `dstack` supports the following configurations:
@@ -133,7 +112,7 @@ Configuration can be defined as YAML files within your repo.
 ### Apply configurations
 
-Apply the configuration either via the `dstack apply` CLI command or through a programmatic API.
+Apply the configuration via the `dstack apply` CLI command, a programmatic API, or [AI agent skills](#install-ai-agent-skills).
 
 `dstack` automatically manages provisioning, job queuing, auto-scaling, networking, volumes, run failures, out-of-capacity errors, port-forwarding, and more — across clouds and on-prem clusters.
diff --git a/docs/assets/javascripts/termynal.js b/docs/assets/javascripts/termynal.js
index de778805c1..9a634c6a6d 100644
--- a/docs/assets/javascripts/termynal.js
+++ b/docs/assets/javascripts/termynal.js
@@ -119,6 +119,21 @@ class Termynal {
             line.removeAttribute(`${this.pfx}-cursor`);
         }
 
+        // Keep cursor visible if the last input has no output after it
+        let lastInputIdx = -1;
+        let hasOutputAfter = false;
+        for (let i = this.lines.length - 1; i >= 0; i--) {
+            if (this.lines[i].getAttribute(this.pfx) === 'input') {
+                lastInputIdx = i;
+                break;
+            }
+            if (this.lines[i].textContent.trim()) {
+                hasOutputAfter = true;
+            }
+        }
+        if (lastInputIdx >= 0 && !hasOutputAfter) {
+            this.lines[lastInputIdx].setAttribute(`${this.pfx}-cursor`, this.cursor);
+        }
         this.addRestart()
         this.finishElement.style.visibility = 'hidden'
         this.lineDelay = this.originalLineDelay
diff --git a/docs/docs/installation.md b/docs/docs/installation.md
index 83b1602b10..959cf89ce7 100644
--- a/docs/docs/installation.md
+++ b/docs/docs/installation.md
@@ -9,7 +9,7 @@ description: How to install the dstack server and CLI
 
 If you don't want to host the `dstack` server (or want to access GPU marketplace), skip installation and proceed to [dstack Sky](https://sky.dstack.ai).
 -->
 
-## Set up the server
+## Launch the server
 
 The server can run on your laptop or any environment with access to the cloud and on-prem clusters you plan to use.
@@ -70,11 +70,11 @@ The server can run on your laptop or any environment with access to the cloud an
 For more details on server deployment options, see the [Server deployment](guides/server-deployment.md) guide.
 
-### Configure backends
+!!! info "Configure backends"
 
-> To orchestrate compute across GPU clouds or Kubernetes clusters, you need to configure [backends](concepts/backends.md).
+    To orchestrate compute across GPU clouds or Kubernetes clusters, you need to configure [backends](concepts/backends.md).
 
-## Set up the CLI
+## Install the CLI
 
 Once the server is up, you can access it via the `dstack` CLI.
@@ -175,7 +175,7 @@ Once the server is up, you can access it via the `dstack` CLI.
 > If you get an error similar to `2: command not found: compdef`, then add the following line to the beginning of your `~/.zshrc` file:
 > `autoload -Uz compinit && compinit`.
 
-### Configure the default project
+### Configure the project
 
 To point the CLI to the `dstack` server, configure it with the server address, user token, and project name:
@@ -195,7 +195,7 @@ Configuration is updated at ~/.dstack/config.yml
 This configuration is stored in `~/.dstack/config.yml`.
 
-### Install agent skills
+## Install AI agent skills
 
 Install [skills](https://skills.sh/dstackai/dstack/dstack) to help AI agents use the `dstack` CLI and edit configuration files.
@@ -207,7 +207,34 @@ $ npx skills add dstackai/dstack
-Agent skills are experimental. Share your feedback via [GitHub issues](https://github.com/dstackai/dstack/issues).
+AI agents like Claude, Codex, and Cursor can now create and manage fleets and submit workloads on your behalf.
+
+<div class="termy">
+
+```shell
+ ▐▛███▜▌ Claude Code v2.1.83
+▝▜█████▛▘ Opus 4.6 (1M context) · Claude Team
+  ▘▘ ▝▝  ~/skills-demo
+
+$ /dstack
+
+dstack skill loaded. How can I help? For example:
+
+  - Apply a configuration (*.dstack.yml)
+  - Check run status (dstack ps)
+  - Manage fleets, volumes, or services
+  - Create or edit a dstack configuration
+  - Troubleshoot provisioning or connectivity issues
+
+  What would you like to do?
+
+$
+```
+
+</div>
+
+!!! info "Feedback"
+    We're actively improving Skills and would love your feedback in [GitHub issues](https://github.com/dstackai/dstack/issues).
 
 !!! info "What's next?"
 
     1. See [Backends](concepts/backends.md)
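---

The changes above repeatedly point at the `dstack` YAML configurations (`*.dstack.yml`) that `dstack apply` and the new agent skills operate on. As a minimal sketch of such a file, assuming the `type`, `name`, `python`, `commands`, and `resources` fields described in the dstack docs (the file name, run name, commands, and GPU size here are hypothetical example values, not taken from this diff):

```yaml
# .dstack.yml: illustrative task configuration (hypothetical values)
type: task
name: train            # hypothetical run name
python: "3.12"
commands:
  - pip install -r requirements.txt
  - python train.py
resources:
  gpu: 24GB            # GPU memory to request
```

A file like this is what `dstack apply -f .dstack.yml` submits, which is the same CLI path the installed skills are meant to drive on the user's behalf.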