Last time, we built a simple model to
predict next week’s TSA traffic using only previous TSA traffic numbers. We relied on a straightforward heuristic: using
last year’s numbers adjusted for the year-over-year (YoY) trend. This simple baseline lets us get a model into production quickly.
From there, we can track performance and gradually refine it to improve profitability.
Today, we take the next step: using the output from this model to make actual trades on Kalshi, a prediction market
platform where users can trade on the outcome of events. As a starting point, we will
email ourselves the model’s results and trade manually.
Manual trading
I like to begin any automated trading strategy by doing some manual trading. This
lets you pick up some intuition on how the market behaves and can inform the automated
trading strategies we build later. For example, in this TSA traffic market, we can show
that TSA traffic is generally lower in the first few days of the week. Do market
participants pick up on this as well? Or do they think more in year-over-year terms
and care more about the percentage change since last year when predicting the final
week’s traffic? Manually trading for a few weeks gives us the opportunity to pick
out patterns like this one.
Setting up our prediction system for manual trading will involve:
1. Set up AWS infrastructure to automate prediction notifications
2. Create a process to scrape recent TSA data
3. Generate predictions for next week
4. Email the results so we can make manual buy/sell decisions
By the end of this post, you’ll have a practical understanding of how to leverage your predictive model to potentially
generate profits on Kalshi.
Note: I link to Kalshi with my referral link. We each get $25 if you sign up and make a certain number of trades with
this link.
Architecture
Our process consists of the following components:
1. Lambda function to turn on/off our EC2 instance
This is purely a cost-saving measure. Our EC2 instance costs money whenever it’s
running, and since we only need it for less than 30 minutes a day, it doesn’t make
sense to pay for it to be on 24 hours a day. The Lambda function’s job is to start
the instance for that short window and stop it afterwards; a sketch of this function
follows the component list below. Here is a good post on how to do this in depth.
2. EventBridge handles scheduling of the Lambda function
EventBridge lets us set up a schedule that triggers our Lambda function at the same time every day.
3. EC2 instance is where the code is executed
EC2 is an AWS service that provides on-demand compute. This is where most of our
work will take place.
4. Python scripts living on the EC2 instance to scrape data and generate predictions
This is the Python code we have been working on over the past few posts. We pull
our git repo down to the EC2 instance and run the scripts from cron jobs at the
specified intervals. This is how we scrape the TSA data daily and generate new
predictions.
5. AWS SES allows us to send prediction results to email
We don’t want to log in to our EC2 instance every morning and check a file to
see the most recent predictions. Instead, we use AWS SES (Simple Email Service)
to email us the results every morning. That way, we can just check our inbox to
see what our algorithm expects for the next week.
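As referenced under component 1, here is a minimal sketch of what that Lambda function could look like. The instance ID, region, and the convention of passing an `action` in the event payload are illustrative assumptions, not the exact setup from the post linked above.

```python
# Hypothetical Lambda handler that starts or stops a single EC2 instance.
# INSTANCE_ID and the region are placeholders; in practice they could come
# from environment variables configured on the Lambda function.
import boto3

INSTANCE_ID = "i-0123456789abcdef0"  # placeholder instance ID
ec2 = boto3.client("ec2", region_name="us-east-1")

def lambda_handler(event, context):
    # Assumed convention for this sketch: the EventBridge rule passes
    # {"action": "start"} or {"action": "stop"} in the event payload.
    action = event.get("action", "start")
    if action == "start":
        ec2.start_instances(InstanceIds=[INSTANCE_ID])
    else:
        ec2.stop_instances(InstanceIds=[INSTANCE_ID])
    return {"instance": INSTANCE_ID, "action": action}
```

Two EventBridge rules with schedule expressions (for example, one to start the instance and one to stop it roughly 30 minutes later) can then invoke this function each morning, so we only pay for the short daily window the instance is actually needed.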
Scrape recent TSA data
Previously, we created some code to scrape TSA traffic data, consolidate it into a single dataset,
and save it as a CSV. Here, we want to take that code and set it up to run on a regular cadence
on our EC2 instance.
To automate this process on our newly set-up EC2 instance, I’m going to put that
code in a standalone Python script and kick it off every morning with a
cron job.
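Since the scraping code itself was covered in the earlier post, here is only a skeleton of how it might be wrapped into a script for cron to call. The file paths, module name (`tsa_scraper`), and function name (`scrape_tsa_data`) are placeholders rather than the actual names from the repo.

```python
# scrape_daily.py -- hypothetical wrapper around the scraping code from the
# previous post, meant to be invoked by cron once a day.
#
# Example crontab entry (run at 07:00 server time every day):
#   0 7 * * * /usr/bin/python3 /home/ec2-user/tsa-kalshi/scrape_daily.py
from pathlib import Path

from tsa_scraper import scrape_tsa_data  # placeholder import for the earlier scraping code

DATA_DIR = Path("/home/ec2-user/tsa-kalshi/data")  # placeholder output directory

def main():
    DATA_DIR.mkdir(parents=True, exist_ok=True)
    df = scrape_tsa_data()  # returns a DataFrame of daily TSA traffic
    df.to_csv(DATA_DIR / "tsa_traffic.csv", index=False)

if __name__ == "__main__":
    main()
```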
A cron job is an easy way to run this Python script at the same time every day. It
writes the TSA traffic data to a directory on the EC2 instance that the later steps
can read from.
Generate predictions for next week’s TSA traffic
Now that we have the most recent TSA traffic data, we can generate next week’s predictions.
We already wrote this code in the previous post.
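As a reminder of what that step looks like, below is a rough sketch of the baseline heuristic (last year’s traffic scaled by the recent year-over-year trend). The column names and the 28-day trailing window are assumptions for illustration; the real logic lives in the code from the previous post.

```python
# Hypothetical sketch of the baseline prediction: take the same week last year
# and scale it by the recent year-over-year ratio. Column names ("date",
# "passengers") and the trailing window are assumed for illustration.
import pandas as pd

def predict_next_week(csv_path: str, trailing_days: int = 28) -> float:
    df = pd.read_csv(csv_path, parse_dates=["date"]).set_index("date").sort_index()

    # Recent YoY ratio: last `trailing_days` days vs. the same days one year earlier.
    recent = df["passengers"].iloc[-trailing_days:]
    last_year_same_days = df["passengers"].reindex(recent.index - pd.DateOffset(years=1))
    yoy_ratio = recent.sum() / last_year_same_days.sum()

    # Next week's dates, and the corresponding dates one year earlier.
    next_week = pd.date_range(df.index.max() + pd.Timedelta(days=1), periods=7, freq="D")
    last_year_next_week = df["passengers"].reindex(next_week - pd.DateOffset(years=1))

    # Scale last year's week by the recent YoY trend.
    return float(last_year_next_week.sum() * yoy_ratio)
```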
Email results
Finally, we need to send the final prediction to our personal email. This makes it
easy to quickly consult the prediction and make our manual trades without having
to log in to the EC2 instance first thing in the morning.
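A minimal sketch of the SES call is below. The sender and recipient addresses are placeholders, and the addresses (or their domain) must already be verified in SES before this will send.

```python
# Hypothetical sketch: email the latest prediction to ourselves via AWS SES.
# The addresses are placeholders and must be verified in SES (or the account
# moved out of the SES sandbox) before this call will succeed.
import boto3

ses = boto3.client("ses", region_name="us-east-1")

def email_prediction(prediction: float, recipient: str = "me@example.com") -> None:
    ses.send_email(
        Source="alerts@example.com",  # placeholder verified sender
        Destination={"ToAddresses": [recipient]},
        Message={
            "Subject": {"Data": "TSA weekly traffic prediction"},
            "Body": {
                "Text": {"Data": f"Predicted TSA traffic for next week: {prediction:,.0f} passengers"}
            },
        },
    )
```

This can be called at the end of the same daily cron run, right after the new prediction is generated, so the email is waiting in the inbox each morning.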
Conclusion
In today’s post, we showed how to automate our simplified model process and send
the results through email using common AWS services.
Over the next few weeks, I will use these results to place trades on the Kalshi
platform with the hope of identifying opportunities to automate some trading
rules.
The Series
Now that the introduction is out of the way, let’s get started. Below
are the different blog posts that are part of this series.
Please reach out if you have any feedback or want to chat.