
How to Install and Configure NVM on macOS


nvm (Node Version Manager) is a tool for installing and managing multiple versions of Node.js on your Mac. It is designed to be installed per-user and invoked per-shell, and it works in any POSIX-compliant shell (sh, dash, ksh, zsh, bash) on Unix, macOS, and Windows WSL.

To install nvm on a Mac, you will need to follow these steps:

Install Homebrew

macOS does not ship with a package manager, so the easiest way to get nvm is through Homebrew. To install Homebrew, open a terminal window and run the following command:

/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"
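
Once the installer finishes, you can optionally confirm that Homebrew is available on your PATH before moving on (a quick sanity check, not part of the installer's output):

brew --version   # prints the installed Homebrew version if the install succeeded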

Install nvm

Once you have Homebrew installed, you can use it to install nvm by running the following command:

brew install nvm
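
The formula prints a few post-install notes (caveats) about wiring nvm into your shell; if you missed them, you can re-display them at any time:

brew info nvm   # shows the formula's caveats and install notes again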

Add nvm to your shell profile

To make nvm available every time you open a new terminal window, you will need to add the following line to your shell profile (e.g., ~/.bash_profile or ~/.zshrc):

source $(brew --prefix nvm)/nvm.sh
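
If sourcing nvm.sh alone does not make the nvm command available in new shells, the Homebrew formula's caveats suggest a slightly fuller snippet along these lines (the exact paths it prints can vary with your Homebrew prefix and nvm version, so treat this as a sketch):

mkdir -p ~/.nvm                                                            # directory nvm uses for installed Node.js versions
export NVM_DIR="$HOME/.nvm"
[ -s "$(brew --prefix nvm)/nvm.sh" ] && \. "$(brew --prefix nvm)/nvm.sh"   # loads nvm into the shell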

Install Node.js

Once nvm is installed, you can use it to install the latest version of Node.js by running the following command:

nvm install node
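
After the install completes, you can confirm which Node.js version is active in the current shell:

node --version   # version of the Node.js binary now on your PATH
nvm current      # version nvm has activated in this shell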

How to use a specific version of Node.js

To use a specific version of Node.js with nvm, you will need to follow these steps:

List available Node.js versions

To see a list of all available Node.js versions that you can install with nvm, run the following command:

nvm ls-remote
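
The full list is long; if you only care about long-term-support releases, nvm can filter it for you:

nvm ls-remote --lts   # show only LTS releases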

Install the desired version

To install a specific version of Node.js, such as version 16, use the following command:

nvm install 16
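
nvm also understands release labels, so instead of pinning a number you can ask for the newest LTS release, and then list what is installed locally:

nvm install --lts   # install the latest long-term-support release
nvm ls              # list locally installed versions and aliases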

Use the installed version

Once the desired version of Node.js is installed, you can use it by running the following command:

nvm use 16
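
For per-project pinning, nvm can also read the desired version from a .nvmrc file in the project directory; running nvm use with no argument then picks it up (the version number here is just an example):

echo "16" > .nvmrc   # record the project's Node.js version
nvm use              # switches to the version listed in .nvmrc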

Set the default version

If you want to use a specific version of Node.js by default, you can set it as the default version using the following command:

nvm alias default 16
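
New terminal sessions will now start on that version. You can confirm the alias took effect with:

nvm alias default   # prints the version the default alias points to
nvm ls              # the default alias also appears in this listing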

Also Read: How to Build a Node.js application with Docker in 5 Minutes



Have Queries? Join https://launchpass.com/collabnix

Ajeet Singh Raina is a former Docker Captain, Community Leader, and Distinguished Arm Ambassador. He is the founder of the Collabnix blogging site and has authored more than 700 blogs on Docker, Kubernetes, and cloud-native technology. He runs a community Slack with over 9,800 members and a Discord server with close to 2,600 members. You can follow him on Twitter (@ajeetsraina).