Merge pull request #849 from swyxio/patch-1

Update CONTRIBUTING.md
This commit is contained in:
Nicolas 2024-11-05 13:30:41 -05:00 committed by GitHub
commit 75b48dced3


@@ -6,7 +6,7 @@ If you're contributing, note that the process is similar to other open source re
 ## Running the project locally
-First, start by installing dependencies
+First, start by installing dependencies:
 1. node.js [instructions](https://nodejs.org/en/learn/getting-started/how-to-install-nodejs)
 2. pnpm [instructions](https://pnpm.io/installation)
@@ -56,12 +56,13 @@ POSTHOG_HOST= # set if you'd like to send posthog events like job logs
 First, install the dependencies using pnpm.
 ```bash
-pnpm install
+# cd apps/api # to make sure you're in the right folder
+pnpm install # make sure you have pnpm version 9+!
 ```
 ### Running the project
-You're going to need to open 3 terminals.
+You're going to need to open 3 terminals. Here is [a video guide accurate as of Oct 2024](https://youtu.be/LHqg5QNI4UY).
 ### Terminal 1 - setting up redis
@@ -77,6 +78,7 @@ Now, navigate to the apps/api/ directory and run:
 ```bash
 pnpm run workers
+# if you are going to use the [llm-extract feature](https://github.com/mendableai/firecrawl/pull/586/), you should also export OPENAI_API_KEY=sk-______
 ```
 This will start the workers who are responsible for processing crawl jobs.
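The updated instructions note that pnpm version 9+ is required before running `pnpm install`. A quick pre-flight check of the installed version can be sketched as below; `version_ok` is a hypothetical helper for illustration, not part of the Firecrawl repo.

```shell
# Hedged sketch: check that a pnpm version string satisfies the
# "pnpm version 9+" requirement mentioned in the diff.
version_ok() {
  major="${1%%.*}"          # take the major version before the first dot
  [ "$major" -ge 9 ]        # succeed only for pnpm 9 or newer
}

# In practice you would pass "$(pnpm --version)" here.
version_ok "9.12.1" && echo "pnpm version OK" || echo "pnpm too old, need 9+"
```

Running the check against an older version string (e.g. `version_ok "8.15.0"`) fails, which is the cue to upgrade before `pnpm install`.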