Enhance Crawl4AI with new features and documentation
- Fix crawler text mode for improved performance; cover missing `srcset` and `data_srcset` attributes in image tags.
- Introduce Managed Browsers for an enhanced crawling experience.
- Update documentation for clearer navigation on configuration.
- Change `text_only` to `text_mode` in configuration and methods.
- Improve performance and relevance in content filtering strategies.
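One of the changes above renames the `text_only` option to `text_mode`. A minimal, hedged sketch of the renamed flag, assuming it is exposed on `BrowserConfig` and simply disables images and other rich content for faster, text-focused crawls:

```python
from crawl4ai import AsyncWebCrawler
from crawl4ai.async_configs import BrowserConfig

# Assumption: the renamed flag lives on BrowserConfig; enabling it skips images
# and other rich content so pages load faster when only text is needed.
browser_config = BrowserConfig(text_mode=True)

async with AsyncWebCrawler(config=browser_config) as crawler:
    result = await crawler.arun(url="https://example.com")
```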
Configure proxy settings and enhance security features in Crawl4AI for reliable data extraction.
## Basic Proxy Setup

Simple proxy configuration with `BrowserConfig`:

```python
from crawl4ai import AsyncWebCrawler
from crawl4ai.async_configs import BrowserConfig

# Using proxy URL
browser_config = BrowserConfig(proxy="http://proxy.example.com:8080")
async with AsyncWebCrawler(config=browser_config) as crawler:
    result = await crawler.arun(url="https://example.com")

# Using SOCKS proxy
browser_config = BrowserConfig(proxy="socks5://proxy.example.com:1080")
async with AsyncWebCrawler(config=browser_config) as crawler:
    result = await crawler.arun(url="https://example.com")
```
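A quick check of the returned result helps confirm that the proxied request actually went through. A brief, hedged sketch, assuming the standard `CrawlResult` fields `success`, `status_code`, and `error_message`:

```python
# Assumption: CrawlResult exposes success, status_code, and error_message.
if result.success:
    print(f"Fetched {result.url} via proxy (HTTP {result.status_code})")
else:
    print(f"Proxied crawl failed: {result.error_message}")
```
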
## Authenticated Proxy

Use an authenticated proxy with `BrowserConfig`:

```python
from crawl4ai import AsyncWebCrawler
from crawl4ai.async_configs import BrowserConfig

proxy_config = {
    "server": "http://proxy.example.com:8080",
    "username": "user",
    "password": "pass"
}

browser_config = BrowserConfig(proxy_config=proxy_config)
async with AsyncWebCrawler(config=browser_config) as crawler:
    result = await crawler.arun(url="https://example.com")
```
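Hard-coded credentials are fine in a snippet, but in real projects they are better pulled from the environment. A minimal sketch of the same setup using environment variables; the variable names here are illustrative assumptions, not Crawl4AI conventions:

```python
import os

from crawl4ai import AsyncWebCrawler
from crawl4ai.async_configs import BrowserConfig

# Hypothetical environment variable names; adjust to your deployment.
proxy_config = {
    "server": os.getenv("PROXY_SERVER", "http://proxy.example.com:8080"),
    "username": os.getenv("PROXY_USERNAME", ""),
    "password": os.getenv("PROXY_PASSWORD", ""),
}

browser_config = BrowserConfig(proxy_config=proxy_config)
async with AsyncWebCrawler(config=browser_config) as crawler:
    result = await crawler.arun(url="https://example.com")
```
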
## Rotating Proxies

Example using a proxy rotation service and updating `BrowserConfig` dynamically:

```python
from crawl4ai import AsyncWebCrawler
from crawl4ai.async_configs import BrowserConfig

async def get_next_proxy():
    # Your proxy rotation logic here
    return {"server": "http://next.proxy.com:8080"}

urls = ["https://example.com/page-1", "https://example.com/page-2"]  # pages to crawl

browser_config = BrowserConfig()
async with AsyncWebCrawler(config=browser_config) as crawler:
    # Update proxy for each request
    for url in urls:
        proxy = await get_next_proxy()
        browser_config.proxy_config = proxy
        result = await crawler.arun(url=url, config=browser_config)
```
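`get_next_proxy` above is only a stub. A minimal sketch of one way to fill it in, cycling round-robin through a fixed pool; the pool and its addresses are illustrative, not part of Crawl4AI:

```python
from itertools import cycle

# Hypothetical proxy pool; replace with your rotation service or a health-checked list.
PROXY_POOL = cycle([
    {"server": "http://proxy-a.example.com:8080"},
    {"server": "http://proxy-b.example.com:8080"},
    {"server": "http://proxy-c.example.com:8080"},
])

async def get_next_proxy():
    # Round-robin over the pool; a real service might also skip unhealthy proxies.
    return next(PROXY_POOL)
```
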
## Custom Headers

Add security-related headers via `BrowserConfig`:

```python
from crawl4ai import AsyncWebCrawler
from crawl4ai.async_configs import BrowserConfig

headers = {
    "X-Forwarded-For": "203.0.113.195",
    "Accept-Language": "en-US,en;q=0.9",
    # ... (one header line elided in the commit diff)
    "Pragma": "no-cache"
}

browser_config = BrowserConfig(headers=headers)
async with AsyncWebCrawler(config=browser_config) as crawler:
    result = await crawler.arun(url="https://example.com")
```
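Custom headers pair naturally with a custom user agent. A brief, hedged sketch, assuming `BrowserConfig` also accepts a `user_agent` parameter:

```python
from crawl4ai import AsyncWebCrawler
from crawl4ai.async_configs import BrowserConfig

# Assumption: BrowserConfig exposes `user_agent` alongside `headers`.
browser_config = BrowserConfig(
    user_agent="Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36",
    headers={"Accept-Language": "en-US,en;q=0.9"},
)
async with AsyncWebCrawler(config=browser_config) as crawler:
    result = await crawler.arun(url="https://example.com")
```
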
## Combining with Magic Mode

For maximum protection, combine proxy with Magic Mode via `CrawlerRunConfig` and `BrowserConfig`:

```python
from crawl4ai import AsyncWebCrawler
from crawl4ai.async_configs import BrowserConfig, CrawlerRunConfig

browser_config = BrowserConfig(
    proxy="http://proxy.example.com:8080",
    headers={"Accept-Language": "en-US"}
)
crawler_config = CrawlerRunConfig(magic=True)  # Enable all anti-detection features

async with AsyncWebCrawler(config=browser_config) as crawler:
    result = await crawler.arun(url="https://example.com", config=crawler_config)
```
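When `magic=True` is too broad for a particular site, the anti-detection behaviors can be enabled selectively. A hedged sketch, assuming `CrawlerRunConfig` also exposes the finer-grained `simulate_user` and `override_navigator` flags:

```python
from crawl4ai import AsyncWebCrawler
from crawl4ai.async_configs import BrowserConfig, CrawlerRunConfig

browser_config = BrowserConfig(proxy="http://proxy.example.com:8080")

# Assumption: these flags exist as granular alternatives to magic=True.
crawler_config = CrawlerRunConfig(
    simulate_user=True,       # simulate user-like mouse and interaction behavior
    override_navigator=True,  # patch navigator properties commonly probed by bot detection
)

async with AsyncWebCrawler(config=browser_config) as crawler:
    result = await crawler.arun(url="https://example.com", config=crawler_config)
```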