‘Channels’ extends Django’s power beyond HTTP, enabling it to handle WebSockets, chat protocols, IoT protocols, and more. It is built on a Python specification called ASGI (Asynchronous Server Gateway Interface).
‘Channels’ builds on the native ASGI support shipped with Django since v3.0, and provides its own fallback implementation for Django v2.2. Django still handles the traditional HTTP protocol, while Channels gives you the choice of handling other connections in either a synchronous or an asynchronous fashion.
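In practice, that choice is expressed in the project’s ASGI entry point, which routes each incoming connection by protocol type. A minimal sketch, assuming a project named core and a main app that defines websocket_urlpatterns (both names are assumptions for illustration):

```python
# core/asgi.py — a sketch, assuming a "core" project and a "main" app
# exposing websocket_urlpatterns in main/routing.py.
import os

os.environ.setdefault("DJANGO_SETTINGS_MODULE", "core.settings")

from django.core.asgi import get_asgi_application
from channels.auth import AuthMiddlewareStack
from channels.routing import ProtocolTypeRouter, URLRouter

from main import routing

application = ProtocolTypeRouter({
    # Plain HTTP requests are still served by Django itself.
    "http": get_asgi_application(),
    # WebSocket connections are handed off to Channels consumers.
    "websocket": AuthMiddlewareStack(
        URLRouter(routing.websocket_urlpatterns)
    ),
})
```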
In this article, we will build a simple project that scrapes real-time stock prices from Yahoo using the yahoo_fin Python package.
Celery is a distributed task/job queue based on message passing. It is centered on real-time operation but also handles scheduling. The execution units, called tasks, are executed concurrently on one or more worker servers. Celery is used in production in hundreds of projects to process millions of tasks per day.
How it works
With Channels and ASGI, clients open incoming connections that are wrapped in a scope (for example, a chat group). Over the lifetime of a scope, a series of events occurs (for example, repeated tasks firing). These events are handled by consumers, which create, modify, or remove data and send responses back to the clients through the channel layer, typically backed by a broker such as Redis.
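A consumer on the receiving end of such events might look like the following sketch. The class name, the stock_{ticker} group name, and the send_stock_update event type are illustrative assumptions, not taken from the original project:

```python
# core/main/consumers.py — an illustrative sketch; class, group, and
# event names are assumptions for this example.
import json

from channels.generic.websocket import AsyncWebsocketConsumer


class StockConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        # Each ticker gets its own group, so one update fans out to
        # every client currently watching that stock.
        self.ticker = self.scope["url_route"]["kwargs"]["ticker"]
        self.group_name = f"stock_{self.ticker}"
        await self.channel_layer.group_add(self.group_name, self.channel_name)
        await self.accept()

    async def disconnect(self, close_code):
        await self.channel_layer.group_discard(self.group_name, self.channel_name)

    async def send_stock_update(self, event):
        # Invoked when an event of type "send_stock_update" reaches this
        # group; forward the payload to the browser.
        await self.send(text_data=json.dumps(event["message"]))
```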
Queues created by Celery are persistent by default, which means that even if the broker goes down, pending tasks will still be executed once the broker is restarted.
Here we’ve created a Django project called core and an application called main.
We’ve installed redis, celery, channels, and django_celery_beat using pip.
We will create two Celery tasks: the first to fetch the top-performing stocks, and the second to stream a particular stock’s data in real time over WebSockets.
Configure the Redis broker and register the project’s applications (channels, django_celery_beat) in the settings, following the official Celery documentation.
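As a rough sketch, the relevant settings might look like this. The Redis address and the exact app list are assumptions based on a default local setup:

```python
# core/settings.py (excerpt) — a sketch assuming Redis running locally
# on the default port 6379.
INSTALLED_APPS = [
    # ... Django's default apps ...
    "channels",
    "django_celery_beat",
    "main",
]

ASGI_APPLICATION = "core.asgi.application"

# Channel layer backed by Redis, used to pass events between
# consumers and background tasks.
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "channels_redis.core.RedisChannelLayer",
        "CONFIG": {"hosts": [("127.0.0.1", 6379)]},
    },
}

# Celery broker and result backend, also pointing at Redis.
CELERY_BROKER_URL = "redis://127.0.0.1:6379/0"
CELERY_RESULT_BACKEND = "redis://127.0.0.1:6379/0"
CELERY_BEAT_SCHEDULER = "django_celery_beat.schedulers:DatabaseScheduler"
```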
core/main/tasks.py (please ignore additional imports)
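The original file is not reproduced here, but under the assumptions above (a Redis channel layer, and a consumer handling events of type send_stock_update), the two tasks could be sketched as follows. The task names and group names are illustrative, not from the original project:

```python
# core/main/tasks.py — a minimal sketch; task names, group names, and
# the "send_stock_update" event type are assumptions for illustration.
from asgiref.sync import async_to_sync
from celery import shared_task
from channels.layers import get_channel_layer
from yahoo_fin import stock_info as si


@shared_task
def send_stock_price(ticker):
    # Scrape the latest live price for one ticker from Yahoo.
    price = si.get_live_price(ticker)
    # Broadcast it to every WebSocket client in this ticker's group.
    channel_layer = get_channel_layer()
    async_to_sync(channel_layer.group_send)(
        f"stock_{ticker}",
        {"type": "send_stock_update", "message": {ticker: round(float(price), 2)}},
    )


@shared_task
def send_top_crypto():
    # Scrape the day's most active cryptocurrencies (see Note 2) and
    # broadcast the top ten rows to the "top_crypto" group.
    top = si.get_top_crypto().head(10)
    channel_layer = get_channel_layer()
    async_to_sync(channel_layer.group_send)(
        "top_crypto",
        {"type": "send_stock_update", "message": top.to_dict("records")},
    )
```

Scheduling these tasks to repeat (for example, every few seconds) is then done through django_celery_beat’s DatabaseScheduler, configured via the Django admin.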
- Make migrations and migrate
- Collect static files
- Execute these specific commands:
- python manage.py runserver
- celery -A core worker -l INFO
- celery -A core beat -l INFO --scheduler django_celery_beat.schedulers:DatabaseScheduler
Note 1: in this tutorial, we have used the yahoo_fin package to scrape data from Yahoo in real time. However, this package is not perfect, and some stock data might not be downloaded correctly; you can easily switch to other APIs such as the IEX Cloud API.
Note 2: in this tutorial, instead of the top markets (most active stocks), we have used the get_top_crypto function to scrape data for the day’s most active cryptocurrencies.