GitHub creator @cocktailpeanut published a guide to running the language model Alpaca locally, which allows users to interact with a fine-tuned version of Facebook’s LLM LLaMA, which “behaves qualitatively similarly to OpenAI’s text-davinci-003.” The guide, called Dalai, consists of a series of JavaScript commands.
While the guide will be simple for anyone who knows JavaScript, Python, and other programming languages, it requires enough technical know-how that the majority of people (myself included) will get lost pretty quickly. Here, using Dalai as a foundation, I’ve put together an absolute beginner’s guide to running Alpaca locally, on your Mac.
Node.js is an open-source JavaScript runtime environment that allows you to run JavaScript code outside of a web browser — in this case, in Terminal. To install it, go to nodejs.org, download the macOS installer, and follow its prompts.
Once the installer finishes, Node.js is installed. If you’re like me, you’ll try to get ahead of the guide and look for a Node.js app to open on your Mac, but that’s not how it works. Node.js is simply a way of executing code on your computer, and you’ll use this capability in Terminal now.
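To see what that means in practice, here is a one-line example (a sketch, not part of the guide itself) you could paste into Terminal once Node.js is installed. It uses Node.js to execute a line of JavaScript with no browser involved:

```shell
# Ask Node.js to run a single line of JavaScript directly,
# outside of any web browser.
node -e "console.log('Hello from Node.js')"
```

If that prints Hello from Node.js, you’ve already run JavaScript on your Mac the same way Dalai’s commands will.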
Using Spotlight, open Terminal on your Mac. When it’s open, you’ll see a console that looks like Notepad.
Terminal lets you control your computer using code instead of clicking on icons or buttons. Here, you can type in commands to tell your computer what to do. Broadly, these can do all kinds of things, like finding files, running programs, or making your Mac do tasks automatically.
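For example, here are two harmless commands you can try to get a feel for it (not required for this guide):

```shell
# Print which folder Terminal is currently "in"
pwd

# List the files and folders in your home directory
ls ~
```

Type one, hit return, and Terminal prints the result — the same pattern every command in this guide follows.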
In Terminal, type the following command and hit return:
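Assuming a standard install, the check is Node’s built-in version flag:

```shell
# Print the installed Node.js version (it starts with "v")
node --version
```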
If it returns a Node.js version like in the screenshot below, you’re good.
Now you’re ready to install the Alpaca model to your hard drive. Once it’s installed, you’ll be able to interact with it through a web UI served on localhost — more on this in Step 5. To install the Alpaca model, give Terminal this command and hit return:
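At the time of writing, the install command in the Dalai README looked like this (7B is the smallest Alpaca model size; treat the exact size as an assumption you can change):

```shell
# Download and install the 7B-parameter Alpaca model via Dalai.
# This fetches several gigabytes, so it can take a while.
npx dalai alpaca install 7B
```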
This will take a minute or two, and your Terminal will look like this:
Once the model has been installed, give it this command, and hit return:

npx dalai serve
Then, open http://localhost:3000 in your browser, where you will be able to talk to Alpaca. Unlike using ChatGPT, running an LLM on localhost provides additional security and privacy benefits, since your data stays on your computer. It can also provide faster response times than ChatGPT, which can be super slow.
Here’s what you should be seeing now, with an example response after I prompted it to describe an apple:
When you’re done, go back to the Terminal window where npx dalai serve is running, and press control+C to stop it.

Stanford’s blog post says the following:
Deploying an interactive demo for Alpaca also poses potential risks, such as more widely disseminating harmful content and lowering the barrier for spam, fraud, or disinformation. We have put into place two risk mitigation strategies. First, we have implemented a content filter using OpenAI’s content moderation API, which filters out harmful content as defined by OpenAI’s usage policies.
Anecdotally, running Alpaca locally seems pretty uncensored. However, it is also highly prone to repetition and hallucination. That said, you can fine-tune it to behave better — and even act similarly to GPT — if you put in enough research and work.
---
@cocktailpeanut’s user-friendly guide is a preview of the potential of running LLMs on local computers, at scale. Individuals running their own LLMs can change their weights, customize their training data, and essentially use the technology ‘their own way’. We are approaching an inflection point after which LLMs are a commodity, and anyone who wants to can personalize their own language model in total privacy, with no restrictions, to supercharge their productivity — or perhaps their delusions.
Brian Roemmele, who writes at multiplex, tweeted yesterday that he used Dalai, in part, to help him “install and operate a full ChatGPT knowledge set… fully trained on my local computer and it needs no Internet once installed.” He says there is “no censorship,” and that “this model is now in a live connect with all of my other AI systems and the results have been absolutely stunning.”
-Brandon Gorrell