Magical Crawler is a web crawling tool that automates collecting, filtering, and managing web data. It is well suited to applications that require regular data collection, such as monitoring classified ads, gathering analytics data, or tracking updates across multiple sources, and is built with reliability and consistent results in mind. The project uses the following stack:
- Go (Golang) - core development language
- PostgreSQL - primary data storage for crawled content
- GORM - ORM for database management
- Telegram Bot - real-time notifications
- Telebot - bot framework for the Telegram Bot API
- Docker - containerized deployment
- Testify - testing and assertions
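To give a rough sense of how these components fit together, here is a minimal sketch wiring a GORM-backed PostgreSQL connection to a Telebot bot. The `Ad` model, the `/count` command, and the environment variable names are illustrative assumptions, not the project's actual code.

```go
package main

import (
	"fmt"
	"log"
	"os"
	"time"

	tele "gopkg.in/telebot.v3"
	"gorm.io/driver/postgres"
	"gorm.io/gorm"
)

// Ad is a hypothetical model for a crawled classified ad.
type Ad struct {
	ID    uint   `gorm:"primaryKey"`
	Title string
	URL   string `gorm:"uniqueIndex"`
}

func main() {
	// DATABASE_DSN and TELEGRAM_BOT_TOKEN are assumed variable names.
	db, err := gorm.Open(postgres.Open(os.Getenv("DATABASE_DSN")), &gorm.Config{})
	if err != nil {
		log.Fatal(err)
	}
	if err := db.AutoMigrate(&Ad{}); err != nil {
		log.Fatal(err)
	}

	bot, err := tele.NewBot(tele.Settings{
		Token:  os.Getenv("TELEGRAM_BOT_TOKEN"),
		Poller: &tele.LongPoller{Timeout: 10 * time.Second},
	})
	if err != nil {
		log.Fatal(err)
	}

	// Hypothetical command: report how many ads have been stored so far.
	bot.Handle("/count", func(c tele.Context) error {
		var n int64
		db.Model(&Ad{}).Count(&n)
		return c.Send(fmt.Sprintf("%d ads stored", n))
	})

	bot.Start()
}
```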
To get started with Magical Crawler:
- **Clone the Repository**

  ```bash
  git clone https://github.com/username/magical-crawler.git
  ```

- **Navigate to the Project Directory**

  ```bash
  cd magical-crawler
  ```

- **Install Dependencies** - Ensure you have Go installed, then install the required packages:

  ```bash
  go mod tidy
  ```

- **Configure Environment** - Set up your `.env` file with the necessary configuration, such as database connection details, Telegram bot credentials, and email settings (see the sketch after this list).

- **Start the Project with Docker**

  ```bash
  sudo docker-compose up --build
  ```

- **Access the API** - The API is accessible at `localhost:8080` by default.
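For the Configure Environment step, the sketch below shows one way such a `.env` file could be loaded in Go using the `godotenv` package. The variable names (`DATABASE_DSN`, `TELEGRAM_BOT_TOKEN`, `SMTP_*`) and the `Config` struct are assumptions for illustration; check the repository's own configuration code for the actual keys.

```go
package config

import (
	"log"
	"os"

	"github.com/joho/godotenv" // assumed helper; the project may load .env differently
)

// Config holds settings read from the .env file.
// Field and variable names here are illustrative assumptions.
type Config struct {
	DatabaseDSN      string
	TelegramBotToken string
	SMTPHost         string
	SMTPUser         string
	SMTPPassword     string
}

// Load reads the .env file (if present) and returns the populated Config.
func Load() Config {
	if err := godotenv.Load(); err != nil {
		log.Println(".env not found, relying on existing environment variables")
	}
	return Config{
		DatabaseDSN:      os.Getenv("DATABASE_DSN"),
		TelegramBotToken: os.Getenv("TELEGRAM_BOT_TOKEN"),
		SMTPHost:         os.Getenv("SMTP_HOST"),
		SMTPUser:         os.Getenv("SMTP_USER"),
		SMTPPassword:     os.Getenv("SMTP_PASSWORD"),
	}
}
```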
Run tests to verify functionality and stability:
```bash
go test ./test
```
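If you write additional tests, Testify assertions follow the pattern sketched below. The test name and values are placeholders, assumed for illustration rather than taken from the project's test suite.

```go
package test

import (
	"testing"

	"github.com/stretchr/testify/assert"
	"github.com/stretchr/testify/require"
)

// TestTestifyStyle shows the assertion style used with Testify.
// The values here are placeholders, not project behavior.
func TestTestifyStyle(t *testing.T) {
	cfg := map[string]string{"PORT": "8080"}

	// require stops the test immediately on failure.
	require.Contains(t, cfg, "PORT")

	// assert records the failure and lets the test continue.
	assert.Equal(t, "8080", cfg["PORT"])
}
```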