Transform any website into structured data with just a few clicks! The Crawl4AI Assistant Chrome Extension provides two powerful tools for web scraping and automation.
Schema Builder
Click to select elements and build extraction schemas visually
Script Builder (Alpha)
Record browser actions to create automation scripts
Python Code
Get production-ready Crawl4AI code instantly
Beautiful UI
Draggable toolbar with macOS-style interface
Quick Start
Download the Extension
Get the latest release from GitHub or use the button below
Download Extension (v1.2.1)
Load in Chrome
Navigate to chrome://extensions/ and enable Developer Mode
Load Unpacked
Click "Load unpacked" and select the extracted extension folder
Explore Our Tools
Schema Builder
Visual data extraction
Script Builder
Browser automation
📊 Schema Builder
Click to extract data visually
Select Container
Click on any repeating element like product cards or articles
Mark Fields
Click on data fields inside the container
Generate & Extract
Get your CSS selectors and Python code instantly
🔴 Script Builder
Record actions, generate automation
Hit Record
Start capturing your browser interactions
Interact Naturally
Click, type, scroll - everything is captured
Export Script
Get JavaScript for Crawl4AI's js_code parameter
See the Generated Code

Schema Builder output — a complete extraction script:

```python
import asyncio
import json

from crawl4ai import AsyncWebCrawler, CrawlerRunConfig
from crawl4ai.extraction_strategy import JsonCssExtractionStrategy

async def extract_products():
    # Schema generated from your visual selection
    schema = {
        "name": "Product Catalog",
        "baseSelector": "div.product-card",  # Container you clicked
        "fields": [
            {
                "name": "title",
                "selector": "h3.product-title",
                "type": "text"
            },
            {
                "name": "price",
                "selector": "span.price",
                "type": "text"
            },
            {
                "name": "image",
                "selector": "img.product-img",
                "type": "attribute",
                "attribute": "src"
            }
        ]
    }

    config = CrawlerRunConfig(
        extraction_strategy=JsonCssExtractionStrategy(schema)
    )

    async with AsyncWebCrawler() as crawler:
        result = await crawler.arun(
            url="https://example.com/products",
            config=config
        )
        return json.loads(result.extracted_content)

asyncio.run(extract_products())
```
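The extractor returns plain strings, so a common follow-up step is normalizing the raw values — for instance, parsing price text into numbers before further use. A minimal sketch; the sample records below are illustrative stand-ins for what `extract_products()` would return:

```python
# Sample records standing in for the extractor's JSON output.
products = [
    {"title": "Wireless Headphones", "price": "$59.99", "image": "/img/wh.jpg"},
    {"title": "USB-C Cable", "price": "$9.50", "image": "/img/cable.jpg"},
]

def normalize(record: dict) -> dict:
    """Strip the currency symbol and convert the price string to a float."""
    price = float(record["price"].lstrip("$").replace(",", ""))
    return {**record, "price": price}

cleaned = [normalize(p) for p in products]
print(cleaned[0])  # price is now a float, ready for sorting or filtering
```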
Script Builder output — recorded actions as a Crawl4AI automation script:

```python
import asyncio

from crawl4ai import AsyncWebCrawler, CrawlerRunConfig

# JavaScript generated from your recorded actions
js_script = """
// Search for products
document.querySelector('button.search-toggle').click();
await new Promise(r => setTimeout(r, 500));

// Type search query
const searchInput = document.querySelector('input#search');
searchInput.value = 'wireless headphones';
searchInput.dispatchEvent(new Event('input', {bubbles: true}));

// Submit search
searchInput.dispatchEvent(new KeyboardEvent('keydown', {
    key: 'Enter', keyCode: 13, bubbles: true
}));

// Wait for results
await new Promise(r => setTimeout(r, 2000));

// Click first product
document.querySelector('.product-item:first-child').click();

// Wait for product page
await new Promise(r => setTimeout(r, 1000));

// Add to cart
document.querySelector('button.add-to-cart').click();
"""

async def automate_shopping():
    config = CrawlerRunConfig(
        js_code=js_script,
        wait_for="css:.cart-confirmation",
        screenshot=True
    )

    async with AsyncWebCrawler() as crawler:
        result = await crawler.arun(
            url="https://shop.example.com",
            config=config
        )
        print(f"✓ Automation complete: {result.url}")
        return result

asyncio.run(automate_shopping())
```
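With `screenshot=True`, the crawl result carries the captured page as a base64-encoded string (exposed as `result.screenshot` in current Crawl4AI releases — check your version's docs). Decoding it to an image file is a few lines; a sketch using a dummy payload in place of a real result:

```python
import base64
import os
import tempfile

def save_screenshot(b64_data: str, path: str) -> int:
    """Decode a base64-encoded screenshot and write it to disk.

    Returns the number of bytes written.
    """
    raw = base64.b64decode(b64_data)
    with open(path, "wb") as f:
        f.write(raw)
    return len(raw)

# Dummy payload standing in for result.screenshot (a real one decodes to a PNG).
fake_screenshot = base64.b64encode(b"\x89PNG\r\n\x1a\n...demo...").decode()
out_path = os.path.join(tempfile.gettempdir(), "crawl_screenshot.png")
written = save_screenshot(fake_screenshot, out_path)
print(f"wrote {written} bytes to {out_path}")
```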
Coming Soon: Even More Power
We're continuously expanding C4AI Assistant with powerful new features to make web scraping even easier:
Run on C4AI Cloud
Execute your extraction directly in the cloud without setting up any local environment. Just click "Run on Cloud" and get your data instantly.
☁️ Instant results • Auto-scaling
Get CrawlResult Without Code
Skip the code generation entirely! Get extracted data directly in the extension as a CrawlResult object, ready to download as JSON.
📊 One-click extraction • No Python needed • Export to JSON/CSV
Smart Schema Suggestions
AI-powered field detection that automatically suggests the most likely data fields on any page, making schema building even faster.
🤖 Auto-detect fields • Smart naming • Pattern recognition
🚀 Stay tuned for updates! Follow our GitHub for the latest releases.