Real-time price intelligence, competitor tracking, and alerting for Noon.com – Saudi Arabia's leading e-commerce marketplace.
- **Price Tracking** – Monitor SKU prices with historical trends
- **Competitor Analysis** – Compare seller pricing across the marketplace
- **Smart Alerts** – Get notified on price drops, stock changes, and anomalies
- **Analytics Dashboard** – Interactive charts and data visualization
- **Daily Scraping** – Automated data collection via Airflow DAGs
- **REST API** – Full-featured API with bearer token auth
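To make the Smart Alerts feature concrete, here is a minimal sketch of what a price-drop check can look like. The function name and the 10% threshold are illustrative assumptions, not the project's actual code.

```python
# Hypothetical price-drop detector: compares the latest observation against
# the previous one. The threshold and naming are illustrative only.

def detect_price_drop(history: list[float], threshold_pct: float = 10.0) -> bool:
    """Return True when the latest price is at least `threshold_pct`
    percent below the previous observation."""
    if len(history) < 2:
        return False  # not enough data to compare
    previous, latest = history[-2], history[-1]
    if previous <= 0:
        return False  # guard against bad data
    drop_pct = (previous - latest) / previous * 100
    return drop_pct >= threshold_pct

# Example: price fell from 199 SAR to 149 SAR (about a 25% drop)
print(detect_price_drop([210.0, 199.0, 149.0]))  # True with the 10% default
```

A real alert pipeline would also deduplicate alerts and consider stock changes; this only shows the core comparison.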
```
┌─────────────────┐     ┌─────────────────┐     ┌─────────────────┐
│   Airflow DAG   │────▶│  Noon Scraper   │────▶│   ClickHouse    │
│   (3 AM UTC)    │     │  (ScraperAPI)   │     │ (Price History) │
└─────────────────┘     └─────────────────┘     └────────┬────────┘
                                                         │
┌─────────────────┐     ┌─────────────────┐              │
│   PostgreSQL    │────▶│     FastAPI     │◀─────────────┘
│ (Users/Products)│     │    :8096/api    │
└─────────────────┘     └────────┬────────┘
                                 │
                        ┌────────▼────────┐
                        │  React + Vite   │
                        │      :3001      │
                        └─────────────────┘
```
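The daily flow in the diagram boils down to a scrape-then-store cycle. The sketch below is purely schematic: both functions are stand-ins for the real scraper (`noon_scraper.py`) and the ClickHouse writer, and none of these names exist in the repo.

```python
# Schematic of one daily run: scrape each SKU, then persist the rows.
# Both functions are stand-ins for the real modules.

def scrape(sku: str) -> dict:
    # Stand-in for a ScraperAPI fetch + parse; returns one price observation.
    return {"sku": sku, "price": 149.0, "currency": "SAR"}

def store(rows: list[dict]) -> int:
    # Stand-in for a batch insert into the ClickHouse price-history table.
    return len(rows)

def run_daily(skus: list[str]) -> int:
    rows = [scrape(sku) for sku in skus]
    return store(rows)

print(run_daily(["SKU-1", "SKU-2"]))  # 2
```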
**Backend**

| Component | Technology |
|---|---|
| API Framework | FastAPI 0.109 |
| Primary DB | PostgreSQL 16 (users, products, auth) |
| Analytics DB | ClickHouse (price history, time-series) |
| Scraping | ScraperAPI + BeautifulSoup4 |
| Orchestration | Apache Airflow |
| Validation | Pydantic v2 |
**Frontend**

| Component | Technology |
|---|---|
| Framework | React 18 + TypeScript |
| Build Tool | Vite |
| Styling | Tailwind CSS |
| UI Components | Radix UI + shadcn/ui |
| Charts | Chart.js |
| State | Zustand + React Query |
| Testing | Vitest + Testing Library |
```
noon-e-commerce/
├── api/                  # FastAPI backend
│   ├── main.py           # API endpoints
│   ├── models.py         # Pydantic schemas
│   └── database.py       # ClickHouse client
├── frontend-ts/          # React TypeScript frontend
│   ├── src/
│   │   ├── components/   # UI components (CompetitorTable, AlertFeed, etc.)
│   │   ├── hooks/        # Custom hooks (useAlertFeed, useProducts)
│   │   ├── services/     # API client
│   │   └── types/        # TypeScript definitions
│   └── package.json
├── docker/               # Docker configurations
├── scripts/              # Utility scripts
├── noon_scraper.py       # Core scraping module
├── noon_dag.py           # Airflow DAG definition
├── postgres_schema.sql   # DB schema
└── docs/                 # Documentation
```
- Python 3.10+
- Node.js 18+
- ClickHouse server
- ScraperAPI account
```bash
git clone https://github.com/aghaPathan/noon-e-commerce.git
cd noon-e-commerce

# Configure environment
cp .env.example .env
# Edit .env with your credentials

# Create virtual environment
python -m venv .venv
source .venv/bin/activate

# Install dependencies
pip install -r requirements.txt

# Run API server
cd api && uvicorn main:app --host 0.0.0.0 --port 8096
```

```bash
cd frontend-ts
npm install
npm run dev   # Starts on http://localhost:3001
```

```bash
# Copy DAG to Airflow
cp noon_dag.py ~/airflow/dags/
# DAG runs daily at 3 AM UTC (6 AM KSA)
```

| Variable | Description | Required |
|---|---|---|
| `SCRAPERAPI_KEY` | ScraperAPI authentication key | Yes |
| `CLICKHOUSE_HOST` | ClickHouse server hostname | Yes |
| `CLICKHOUSE_PORT` | ClickHouse native port (default: 9000) | No |
| `CLICKHOUSE_USER` | Database username | Yes |
| `CLICKHOUSE_PASSWORD` | Database password | Yes |
| `CLICKHOUSE_DB` | Database name (default: `noon`) | No |
| `API_TOKEN` | Bearer token for API auth | Yes |
| `API_PORT` | API server port | No |
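As an illustration of how these variables might be consumed, here is a hedged sketch of a config loader. Only the variable names and the two defaults (`9000`, `noon`) come from the table above; the function itself is an assumption, not the project's actual code.

```python
# Illustrative loader for the ClickHouse variables; required ones raise
# KeyError when missing, optional ones fall back to the documented defaults.
import os

def load_clickhouse_config() -> dict:
    return {
        "host": os.environ["CLICKHOUSE_HOST"],                   # required
        "port": int(os.environ.get("CLICKHOUSE_PORT", "9000")),  # default 9000
        "user": os.environ["CLICKHOUSE_USER"],                   # required
        "password": os.environ["CLICKHOUSE_PASSWORD"],           # required
        "database": os.environ.get("CLICKHOUSE_DB", "noon"),     # default "noon"
    }

# Demo values so the sketch runs standalone:
os.environ.setdefault("CLICKHOUSE_HOST", "localhost")
os.environ.setdefault("CLICKHOUSE_USER", "default")
os.environ.setdefault("CLICKHOUSE_PASSWORD", "secret")
cfg = load_clickhouse_config()
print(cfg["port"], cfg["database"])  # 9000 noon (when the optional vars are unset)
```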
| Method | Endpoint | Description |
|---|---|---|
| GET | `/api/products` | List all tracked products |
| GET | `/api/products/{sku}` | Get product details |
| GET | `/api/prices/{sku}` | Get price history |
| GET | `/api/prices/{sku}/competitors` | Get competitor prices |
| GET | `/api/alerts` | Get active alerts |
| POST | `/api/alerts/acknowledge/{id}` | Acknowledge an alert |
| GET | `/api/health` | Health check |
All endpoints require the `Authorization: Bearer <API_TOKEN>` header.
```bash
# Backend tests
pytest --cov=api

# Frontend tests
cd frontend-ts
npm run test
npm run test:ui   # Interactive UI
```

| Time (UTC) | Time (KSA) | Action |
|---|---|---|
| 03:00 | 06:00 | Daily price scrape |
SKUs are configured in `skus.txt` (one per line).
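Loading such a file might look like the sketch below. Only "one SKU per line" is documented; skipping blank lines and `#` comments is an assumed convenience, and the SKU strings are made up.

```python
# Hypothetical loader for skus.txt: one SKU per line, as described above.
from pathlib import Path

def load_skus(path: str = "skus.txt") -> list[str]:
    lines = Path(path).read_text(encoding="utf-8").splitlines()
    # Assumed behavior: ignore blank lines and '#' comment lines.
    return [ln.strip() for ln in lines
            if ln.strip() and not ln.lstrip().startswith("#")]

# Example usage with a throwaway file and invented SKUs:
Path("demo_skus.txt").write_text("N12345678A\n\n# retired\nN87654321B\n")
print(load_skus("demo_skus.txt"))  # ['N12345678A', 'N87654321B']
```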
- Credentials stored in `.env` (gitignored)
- API protected with bearer token authentication
- ClickHouse access restricted to localhost
- No PII collected – only public product data
- Fork the repository
- Create a feature branch (`git checkout -b feature/amazing-feature`)
- Commit changes (`git commit -m 'Add amazing feature'`)
- Push to the branch (`git push origin feature/amazing-feature`)
- Open a Pull Request
MIT License – see `LICENSE` for details.
Agha Awais – @aghaPathan
Built for the KSA market 🇸🇦
All PRs are checked for:
- ✅ Syntax (Python, JS, TS, YAML, JSON, Dockerfile, Shell)
- ✅ Secrets (No hardcoded credentials)
- ✅ Security (High-severity vulnerabilities)