Fully automating my stock trading portfolio with the Zerodha Kite API, GCP Compute Engine, and a Telegram bot (buy, sell, and then buy again)
Mayank Jain · 4 min read
It all started while I was exploring momentum trading. I was screening for the stocks with the highest absolute returns to build a test portfolio, but the “lazy developer” in me quickly got frustrated: manually placing buy orders for multiple stocks was tedious and cumbersome.
My curious mind decided to explore the Zerodha Kite API to automate order execution. However, as a data professional, I was hungry for more. I didn’t just want to place orders; I wanted to automate the entire lifecycle: fetching historical data, calculating technical indicators, selecting the portfolio, and deploying it all on the cloud for 100% hands-off execution.
The Architecture: Building a Reliable Execution Engine
Before moving to the cloud, I perfected the logic on my local machine. The final architecture, now running on a GCP Compute Engine instance (e2-micro) in the us-central1-f zone, is designed for reliability at minimal cost: roughly INR 5 per trading day, or INR 100–150 per month.
The Ubuntu-based system relies on two standard Linux mechanisms:
- systemd services: keep the Telegram bot and the SL/TSL (Stop-Loss/Trailing Stop-Loss) monitoring processes “always on.”
- Cron jobs: trigger the daily data pipeline and the initial “Buy” trades at specific times.
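As a rough sketch, the two mechanisms might look like the snippets below. The unit name, script paths, and timings are illustrative placeholders, not the author's actual files:

```ini
# --- systemd unit, e.g. /etc/systemd/system/trading-bot.service (hypothetical) ---
[Unit]
Description=Telegram bot and SL/TSL monitoring service
After=network-online.target

[Service]
ExecStart=/usr/bin/python3 /opt/bot/main.py
Restart=always
RestartSec=5

[Install]
WantedBy=multi-user.target

# --- crontab entries (crontab -e), weekdays only, times illustrative ---
# 10 8 * * 1-5  /usr/bin/python3 /opt/bot/data_pipeline.py
# 15 9 * * 1-5  /usr/bin/python3 /opt/bot/place_buys.py
```

`Restart=always` is what keeps the monitor from dying silently; cron handles the strictly time-boxed steps.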
The Workflow: A Day in the Life of the Bot
The beauty of this system is that it handles the “heavy lifting” while keeping me in the loop for the most critical steps: consent, buy updates, and sell updates.
1. The Morning Handshake (8:00 AM)
At 8 AM, the VM starts automatically via a GCP instance schedule. The Telegram bot sends me a prompt to log in. Because brokers require daily manual consent for API trading, I click the link, log in, and paste the request_token back to the bot.
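The daily handshake can be sketched with the official kiteconnect client. The API key/secret values and the token-extraction helper below are illustrative, not the author's code:

```python
from urllib.parse import urlparse, parse_qs

def extract_request_token(redirect_url: str) -> str:
    """Pull the request_token query parameter out of the post-login redirect URL."""
    params = parse_qs(urlparse(redirect_url).query)
    return params["request_token"][0]

# Daily exchange (requires real credentials; shown for illustration only):
# from kiteconnect import KiteConnect
# kite = KiteConnect(api_key="your_api_key")          # hypothetical key
# print(kite.login_url())                             # bot sends this link via Telegram
# token = extract_request_token(pasted_redirect_url)  # user pastes the redirect back
# data = kite.generate_session(token, api_secret="your_api_secret")
# kite.set_access_token(data["access_token"])

print(extract_request_token("https://example.com/?status=success&request_token=abc123"))
# → abc123
```

Once `set_access_token` succeeds, every later API call that day reuses the same session.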
2. Data Preparation & Selection (8:00 AM — 9:15 AM)
Once authenticated, the bot performs a “Data Sync,” updating OHLCV (Open, High, Low, Close, Volume) data for nearly 500 stocks. It then:
- Calculates the buy-signal indicators along with the SL and Trailing SL buffer levels.
- Ranks stocks based on momentum.
- Prepares a Top 20 Portfolio based on my specific capital allocation rules.
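The selection step above might look like this in pure Python. The 90-bar lookback, equal-weight allocation, and data shapes are my assumptions for illustration, not the author's actual capital-allocation rules:

```python
def momentum_returns(closes: dict[str, list[float]], lookback: int = 90) -> dict[str, float]:
    """Absolute return over the lookback window for each symbol (assumed rule)."""
    out = {}
    for sym, series in closes.items():
        if len(series) > lookback and series[-1 - lookback] > 0:
            out[sym] = series[-1] / series[-1 - lookback] - 1.0
    return out

def build_portfolio(closes, capital: float, top_n: int = 20):
    """Rank by momentum, take the top N, split capital equally (illustrative)."""
    ranked = sorted(momentum_returns(closes).items(), key=lambda kv: kv[1], reverse=True)
    picks = ranked[:top_n]
    per_stock = capital / max(len(picks), 1)
    # Whole-share quantity at the latest close
    return {sym: int(per_stock // closes[sym][-1]) for sym, _ in picks}

closes = {
    "AAA": [100.0] * 90 + [150.0],   # +50% over the window
    "BBB": [100.0] * 90 + [120.0],   # +20%
    "CCC": [100.0] * 90 + [90.0],    # -10%, dropped
}
print(build_portfolio(closes, capital=100000, top_n=2))
# → {'AAA': 333, 'BBB': 416}
```

The real pipeline runs this over ~500 symbols of freshly synced OHLCV data rather than a toy dict.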
3. Market Execution (9:15 AM — 3:30 PM)
At market open, the bot begins monitoring. It executes buys based on the prepared portfolio and starts the Monitoring Service.
- Stop-Loss (SL) & Trailing Stop-Loss (TSL): If a stock hits my exit criteria, the bot sells immediately.
- Dynamic Rebalancing: If capital is freed up by a sale, the bot scans for the next available “Top 20” candidate and buys again.
- Instant Alerts: Every trade — buy or sell — is pushed to my Telegram with the reason, quantity, and P&L.
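The core decision inside the monitoring loop can be sketched as a pure function. The 5% trailing buffer is a placeholder, not the author's actual parameter:

```python
from dataclasses import dataclass

@dataclass
class Position:
    entry: float   # buy price
    stop: float    # current stop level (SL, later ratcheted up as a TSL)
    high: float    # highest price seen since entry

def update_stop(pos: Position, ltp: float, trail_pct: float = 0.05):
    """Ratchet the stop upward on new highs; signal an exit when LTP breaches it.

    Returns (updated position, sell_now flag).
    """
    if ltp <= pos.stop:
        return pos, True                           # stop hit -> sell immediately
    if ltp > pos.high:
        new_stop = max(pos.stop, ltp * (1 - trail_pct))
        pos = Position(pos.entry, new_stop, ltp)   # trail the stop, never lower it
    return pos, False

pos = Position(entry=100.0, stop=95.0, high=100.0)
pos, sell = update_stop(pos, 120.0)   # new high: stop trails up to ~114.0
print(pos.stop, sell)
pos, sell = update_stop(pos, 113.0)   # LTP below the trailed stop -> exit
print(sell)
```

When `sell_now` comes back `True`, the real bot places the sell order, pushes the Telegram alert, and frees the capital for the next candidate.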
4. Post-Market Shutdown (4:00 PM)
Once the market closes, the VM shuts down automatically to save costs. The bot sends a final confirmation that the instance is sleeping.
Technical Takeaways
Building this system taught me a lot about:
- Cloud Infrastructure: Setting up Linux-based VMs, handling external IP whitelisting for API security, and managing persistent disk storage for CSV/JSON ledgers.
- API Integration: Interfacing with the KITE API for real-time data and order placement.
- System Reliability: Using systemd to ensure monitoring services never crash silently.
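For the API-integration piece, a data-sync call with the kiteconnect client might look like the snippet below. The instrument token and date range are placeholders, and the small reshaping helper is mine, not the library's:

```python
def to_close_series(candles: list[dict]) -> list[float]:
    """Reduce Kite-style candle dicts to a plain list of closing prices."""
    return [c["close"] for c in candles]

# Live fetch (needs an authenticated KiteConnect session; illustrative only):
# from datetime import date
# candles = kite.historical_data(
#     instrument_token=738561,        # placeholder token
#     from_date=date(2024, 1, 1),
#     to_date=date(2024, 6, 30),
#     interval="day",
# )
# closes = to_close_series(candles)

print(to_close_series([{"close": 101.5}, {"close": 103.0}]))
# → [101.5, 103.0]
```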
Final Thoughts
The objective was simple: build a system that follows a strategy without the interference of human emotion or manual delay. While this is currently a “Simple” deployment on a single VM, it serves as a foundation. The next step is extending this into a more robust system by integrating cloud databases or ML-based strategies for even more complex compute tasks.
If you have a strategy in place, the cloud is your best friend. It’s time to stop clicking and start coding!