chore: Update README.md and project structure
Changed file: README.md
@@ -29,10 +29,19 @@ There are two ways to use Crawl4AI: as a library in your Python projects or as a
To install Crawl4AI as a library, follow these steps:

1. Install the package from GitHub:

```sh
pip install git+https://github.com/unclecode/crawl4ai.git
```

Alternatively, you can clone the repository and install the package locally:

```sh
virtualenv venv
source venv/bin/activate
git clone https://github.com/unclecode/crawl4ai.git
cd crawl4ai
pip install -e .
```

2. Import the necessary modules in your Python script:

```python
from crawl4ai.web_crawler import WebCrawler
```
@@ -46,28 +55,28 @@ from crawl4ai.models import UrlModel
To run Crawl4AI as a standalone local server, follow these steps:

1. Clone the repository:

```sh
git clone https://github.com/unclecode/crawl4ai.git
```

2. Navigate to the project directory:

```sh
cd crawl4ai
```

3. Open `crawler/config.py` and set your favorite LLM provider and API token.

4. Build the Docker image:

```sh
docker build -t crawl4ai .
```

For Mac users, use the following command instead:

```sh
docker build --platform linux/amd64 -t crawl4ai .
```

5. Run the Docker container:

```sh
docker run -d -p 8000:80 crawl4ai
```
@@ -82,7 +91,7 @@ Choose the approach that best suits your needs. If you want to integrate Crawl4A
Here's an example of how to use Crawl4AI with Python to crawl a webpage and retrieve the extracted data:

1. Make sure you have the `requests` library installed. You can install it using pip:

```sh
pip install requests
```
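With `requests` installed, a minimal client sketch could look like the following. The `http://localhost:8000/crawl` endpoint is the one this README references; the request-body field name (`urls`) is an assumption for illustration, not a documented schema.

```python
# Endpoint URL as referenced in this README; adjust the port if you mapped
# the Docker container differently.
SERVER_URL = "http://localhost:8000/crawl"

# Hypothetical request body -- the "urls" field name is an assumption,
# not a documented schema; check the server's request model.
payload = {"urls": ["https://example.com"]}

def crawl(url=SERVER_URL, data=payload):
    import requests  # installed in step 1 above
    resp = requests.post(url, json=data)
    resp.raise_for_status()  # surface HTTP errors instead of parsing an error body
    return resp.json()

# With the Docker container from the previous section running:
# print(crawl())
```

The network call is wrapped in a function so the request only fires when you invoke `crawl()` against a running server.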
@@ -129,7 +138,7 @@ Make sure to replace `"http://localhost:8000/crawl"` with the appropriate server
You can also use Crawl4AI as a Python library in your own projects. Here's an example of how to use the Crawl4AI library:

1. Install the required dependencies:

```sh
pip install -r requirements.txt
```
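Building on the `WebCrawler` import shown earlier in this README, a hedged sketch of library usage might look like this. The method names (`warmup`, `run`) and the `markdown` result attribute are assumptions about the API, not confirmed by this document.

```python
def extract_markdown(url):
    """Hypothetical usage sketch: the method names (warmup, run) and the
    result attribute (markdown) are assumptions, not confirmed by this
    README -- check crawl4ai/web_crawler.py for the real interface."""
    from crawl4ai.web_crawler import WebCrawler  # import shown earlier in this README

    crawler = WebCrawler()
    crawler.warmup()               # assumed one-time setup step
    result = crawler.run(url=url)  # assumed single-URL crawl call
    return result.markdown         # assumed extracted-content field

# With crawl4ai installed:
# print(extract_markdown("https://example.com"))
```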