Mastering Performance: How to Deploy a Self-Hosted Turbo Repo Remote Cache for Enhanced Project Builds

Discover the step-by-step guide on setting up a self-hosted Turbo Repo remote cache using Thibaut Marechal's solution. Boost your build times and team collaboration with this efficient caching mechanism.

Turbocharging Your Development Workflow

In the bustling world of software development, efficiency isn't just a luxury: it's as essential as a morning croissant in Paris. Enter Turbo Repo, a nimble build system tailored for JavaScript and TypeScript monorepos that's as swift as a scooter weaving through the Marais district. While Turbo Repo natively supports various remote caching solutions, opting for a self-hosted remote cache is like having your own private café: complete control, no queue, and the coffee is always brewed to your liking. Let's embark on a journey to deploy your very own self-hosted Turbo Repo remote cache, using the resourceful open-source solution from Thibaut Marechal's GitHub repository. So, tighten your beret and let's dive in, because just like the Paris Metro at rush hour, development waits for no one! (Nah, I'm kidding, the beret is overhyped. Who wears a beret in Paris? Only tourists... please don't do that 💀)

Why Self-Host a Turbo Repo Remote Cache?

A remote cache serves as a central hub where build artifacts are stored. This means subsequent builds can fetch these artifacts instead of rebuilding them, significantly cutting down build times and improving developer productivity. By hosting your own remote cache, you gain:

  • Full control over your data: Keep your sensitive information within your infrastructure.
  • Customized scaling options: Scale your caching needs according to your project size and team requirements.
  • Reduced external dependencies: Minimize downtime and latency issues associated with third-party services.
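To see why a shared cache helps, recall what Turbo actually caches: the outputs each task declares in turbo.json, keyed by a hash of the task's inputs. Those output artifacts are exactly what get pushed to and pulled from the remote cache. A minimal configuration might look like this (task names and output globs are illustrative; note that Turbo v2 renames the pipeline key to tasks):

```json
{
  "$schema": "https://turbo.build/schema.json",
  "pipeline": {
    "build": {
      "dependsOn": ["^build"],
      "outputs": ["dist/**"]
    },
    "test": {
      "dependsOn": ["build"]
    }
  }
}
```

Any task with declared outputs becomes cacheable: once one teammate (or CI) has built a given input hash, everyone else downloads the artifacts instead of rebuilding.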

Last but not least, you get the Full Turbo. Isn't this beautiful? 😎🤌🏾

Example of cached turbo build task

Step 1: Preparing Your Environment

Before you get started with your self-hosted Turbo Repo remote cache, ensure Docker is installed on your server. Docker gives the cache a consistent, reproducible environment, which makes it more reliable to run and upgrade. If Docker is not yet installed, you can find installation instructions for various platforms on the official Docker website.

Step 2: Deploying the compose file

Create a Project Directory: Start by creating a dedicated directory on your server for your project. This directory will house all necessary files, including your Docker Compose configuration. You can create it with a command like:

mkdir turborepo-cache
cd turborepo-cache

Prepare the Docker Compose File: Inside your new project directory, create a Docker Compose YAML file named docker-compose.yml and paste in the content below. It defines your PostgreSQL database and the Turbo Repo remote cache service:

version: '3'
services:
  db:
    image: postgres:15.5
    restart: always
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: turborepo
      PGDATA: /var/lib/postgresql/data/pgdata
    volumes:
      - ./data/postgres/:/var/lib/postgresql/data
  turborepo-remote-cache:
    image: thibmarechal/turborepo-remote-cache:1.13.0
    depends_on:
      - db
    ports:
      - "8080:8080"
    environment:
      PORT: 8080
      STORAGE_TYPE: fs
      STORAGE_FS_PATH: /data
      COOKIE_NOT_SECURE: 'true'
      DATABASE_URL: postgres://postgres:postgres@db:5432/turborepo
      ADMIN_USERNAME: admin
      ADMIN_NAME: admin
      ADMIN_PASSWORD: change-me-with-a-long-ahh-password
      ADMIN_EMAIL: [email protected]
    volumes:
      - ./data/fs:/data
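One word of caution before you launch this anywhere serious: change the admin and database passwords. Docker Compose can interpolate values from a .env file sitting next to docker-compose.yml, which keeps secrets out of the YAML you might commit. A sketch, with variable names of my own choosing:

```yaml
# In docker-compose.yml, reference variables instead of literals:
#   POSTGRES_PASSWORD: ${POSTGRES_PASSWORD}
#   ADMIN_PASSWORD: ${ADMIN_PASSWORD}
#   DATABASE_URL: postgres://postgres:${POSTGRES_PASSWORD}@db:5432/turborepo
#
# Then in a .env file (kept OUT of version control):
#   POSTGRES_PASSWORD=a-genuinely-long-random-password
#   ADMIN_PASSWORD=another-long-random-password
```

Compose reads .env automatically from the project directory, so no extra flags are needed.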

Configuration: All of the options below are set through the environment block of the turborepo-remote-cache service.

  • PORT: The port on which the application will run.
  • STORAGE_TYPE: The type of storage to use (fs, s3, azure).
  • STORAGE_FS_PATH: The path where to store the cache when using file storage.
  • STORAGE_S3_ACCESS_KEY_ID: Access key ID for Amazon S3 storage.
  • STORAGE_S3_SECRET_ACCESS_KEY: Secret access key for Amazon S3 storage.
  • STORAGE_S3_FORCE_PATH_STYLE: Boolean to force path style for S3 requests.
  • STORAGE_S3_ENDPOINT: Endpoint URL for S3 storage.
  • STORAGE_S3_REGION: Region for S3 storage.
  • STORAGE_S3_SSL_ENABLED: Boolean to enable SSL for S3 storage.
  • STORAGE_S3_BUCKET: Bucket name for S3 storage.
  • STORAGE_AZURE_STORAGE_ACCOUNT: Account name for Azure Blob storage.
  • STORAGE_AZURE_STORAGE_ACCESS_KEY: Access key for Azure Blob storage.
  • STORAGE_AZURE_STORAGE_CONTAINER: Container name for Azure Blob storage.
  • DATABASE_URL: Connection URL for the Postgres database.
  • ADMIN_USERNAME: Username for the admin account.
  • ADMIN_NAME: Name for the admin account.
  • ADMIN_PASSWORD: Password for the admin account.
  • ADMIN_EMAIL: Email for the admin account.
  • OIDC: Boolean to enable OpenID Connect (OIDC) authentication.
  • OIDC_NAME: Name for the OIDC provider.
  • OIDC_AUTHORIZATION_URL: Authorization URL for the OIDC provider.
  • OIDC_TOKEN_URL: Token URL for the OIDC provider.
  • OIDC_CLIENT_ID: Client ID for the OIDC provider.
  • OIDC_CLIENT_SECRET: Client secret for the OIDC provider.
  • OIDC_PROFILE_URL: Profile URL for the OIDC provider.
  • AZURE_AD: Boolean to enable Azure Active Directory authentication.
  • AZURE_AD_CLIENT_ID: Client ID for Azure AD authentication.
  • AZURE_AD_CLIENT_SECRET: Client secret for Azure AD authentication.
  • AZURE_AD_TENANT_ID: Tenant ID for Azure AD authentication.
  • COOKIE_NOT_SECURE: Set to true when serving over plain HTTP; leave it false when serving over HTTPS.
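If you would rather back the cache with S3-compatible object storage than the local filesystem, swap the fs-specific variables for the S3 ones. A hypothetical fragment (bucket, region, endpoint, and credential variable names are placeholders for your own values):

```yaml
    environment:
      STORAGE_TYPE: s3
      STORAGE_S3_BUCKET: turborepo-cache            # placeholder bucket name
      STORAGE_S3_REGION: eu-west-3
      STORAGE_S3_ENDPOINT: https://s3.eu-west-3.amazonaws.com
      STORAGE_S3_ACCESS_KEY_ID: ${S3_ACCESS_KEY_ID}
      STORAGE_S3_SECRET_ACCESS_KEY: ${S3_SECRET_ACCESS_KEY}
      STORAGE_S3_SSL_ENABLED: 'true'
      STORAGE_S3_FORCE_PATH_STYLE: 'false'          # set 'true' for MinIO-style endpoints
```

Object storage spares you from managing the ./data/fs volume and makes it easier to run multiple cache instances against the same artifacts.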

Launch the Services: With your docker-compose.yml ready, use Docker Compose to build and start your services. Execute the following command from within your project directory:

docker compose up -d

This command runs your containers in detached mode, meaning they'll continue running in the background.

Step 3: Integrating with Your Turbo Repo

Let's make your Turbo Repo projects use your new self-hosted cache. It's easy: just run this command at the root of your Turbo Repo:

turbo login --login "https://[your-turbo-repo-link]/turbo/login" --api "https://[your-turbo-repo-link]/turbo/api"

This command logs you in, but the repo is not linked yet. To link it, run the command below; you will be prompted to choose a team for the project.

turbo link

These commands also create a config file at .turbo/config.json. It is not advised to commit this file, so add .turbo to your .gitignore.
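In CI, where running an interactive turbo login is impractical, Turbo can pick up the cache location and credentials from environment variables instead of the local config file. A sketch with placeholder values (the host and token are examples to replace with your own deployment's):

```shell
# Hypothetical values -- point these at your own cache deployment.
export TURBO_API="https://cache.example.com/turbo/api"  # your cache's /turbo/api endpoint
export TURBO_TEAM="my-team"                             # team chosen during `turbo link`
export TURBO_TOKEN="token-from-the-cache-web-ui"        # access token for the cache

# With these set, `turbo run build` in the CI job reads and writes
# the remote cache without needing .turbo/config.json.
```

This also keeps secrets in your CI provider's secret store rather than in the repository.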

By doing so, your builds will know exactly where to drop off and pick up their cached artifacts, similar to how Parisians know their way around the métro, only way quicker AND, unlike the métro, never late.

Parisian metro and Turbo repo are the same but different

Your Build Process, Streamlined

Congratulations! Your Dockerized self-hosted Turbo Repo remote cache is now set up and running, akin to a well-oiled TGV on its way to Marseille (but still faster). This setup not only optimizes your build times but does so with the elegance and efficiency that would make any developer puff out their chest like a proud Parisian pigeon. Enjoy the accelerated development cycle, enhanced control, and the delightful fact that your builds are now more reliable. Bonne continuation, and may your code compile smoothly.