Chop chop chop!
It's a common feeling in New York City. "I got woken up at 4:30 in the morning last night by some damn police helicopter circling like five feet above my apartment," you say. "Do you know what it was?" your friend asks, even though they know you don't have any idea.
This bot aims to help solve that problem, at least a little bit, by:
- tracking where the NYPD's helicopters are flying -- with ADS-B
- figuring out when they're hovering -- with machine learning
- calculating the center point of the circles they make while flying -- with geometry
- figuring out what happened at that point around that time -- with machine learning
So far, #1, #2 and #3 are at least partially solved.
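For the curious, one standard way to do #3 is a least-squares circle fit (the Kasa method) on the track points of a detected hover. Here's a minimal sketch; the function name and the flat-plane projection shortcut are mine, not necessarily what the bot actually does:

```python
# A minimal sketch of fitting a circle to a hovering helicopter's track
# points and taking its center. Illustrative only; not the repo's code.
import numpy as np

def circle_center(lats, lons):
    """Return (lat, lon) of the best-fit circle through the given points."""
    lats = np.asarray(lats, dtype=float)
    lons = np.asarray(lons, dtype=float)

    # Project to a local flat plane so the fit is Euclidean; fine at city
    # scale, where a degree of longitude shrinks with cos(latitude).
    lat0 = lats.mean()
    x = (lons - lons.mean()) * np.cos(np.radians(lat0))
    y = lats - lat0

    # Kasa fit: solve x^2 + y^2 + D*x + E*y + F = 0 by least squares.
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x**2 + y**2)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)

    # The center of that circle is (-D/2, -E/2) in local coordinates.
    cx, cy = -D / 2, -E / 2

    # Undo the projection.
    return lat0 + cy, lons.mean() + cx / np.cos(np.radians(lat0))
```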
Etymologically, helicopter comes from helix + pteron meaning, well, "helix" and "wing" (like a pterodactyl). But someone reanalyzed it as heli + copter and now we have copters! Wow.
Cool. It's kind of involved but it's totally doable. I believe in you.
- You need to be able to receive ADS-B signals for your area: put a handful of Raspberry Pis with DVB-T receivers in spots with line of sight to most or all of your area. They don't all have to be in the same place. (We have receivers in far northern Manhattan, lower Manhattan and Brooklyn...)
- Build out a basemap in dump1090-mapper. In New York City, we use parks, airports and rivers/the bay as wayfinding guides. What to use here depends on your city; in Atlanta, I'd use the freeways. Pull requests accepted.
- Set up one MySQL database for all the receivers to write to. (A sketch of what a receiver-side collector might look like follows this list.)
- more to come...
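For reference, here's a minimal sketch of a receiver-side collector, assuming each Pi runs dump1090 with its BaseStation ("SBS") feed on TCP port 30003 (the usual default). The `positions` table and its columns are hypothetical; the real schema lives in this repo's scripts, so treat this as illustrative:

```python
# Read dump1090's SBS feed on port 30003 and insert position reports
# into a (hypothetical) shared MySQL table.
import socket

import mysql.connector  # pip install mysql-connector-python

db = mysql.connector.connect(host="db-host", user="copterbot",
                             password="...", database="copterbot")
cur = db.cursor()

sock = socket.create_connection(("localhost", 30003))
buf = b""
while True:
    data = sock.recv(4096)
    if not data:
        break  # dump1090 went away
    buf += data
    # Keep any trailing partial line in the buffer for next time.
    *lines, buf = buf.split(b"\n")
    for line in lines:
        f = line.decode(errors="replace").strip().split(",")
        # MSG,3 is an airborne position report; lat/lon are fields 15/16.
        if len(f) >= 16 and f[0] == "MSG" and f[1] == "3" and f[14] and f[15]:
            cur.execute(
                "INSERT INTO positions (icao, altitude, lat, lon) "
                "VALUES (%s, %s, %s, %s)",
                (f[4], f[11] or None, float(f[14]), float(f[15])),
            )
            db.commit()
```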
Once we've assembled a decent-sized corpus of helicopter flights, we need to figure out how to detect when a helicopter is hovering. We're going to do that with machine learning. In order to do that, we need to give the computer hand-picked examples of helicopters hovering and of helicopters doing other, non-hovering things. Here's how we do that.
- Run `ruby generate_images_for_hand_classification.rb` with the appropriate database env vars. This generates a `hover_train_png` folder (and a `hover_train_svg` folder, in which you should run a webserver with `python -m http.server` -- lol sorry this is complicated) with PNGs representing 5-minute-long segments of helicopter paths, along with `shingles.csv` with metadata about each segment. (The segments overlap.)
- Create the `hand_coded_training_data` folder and COPY `hover_train_png` into `hand_coded_training_data/hover_train_png_hover_only` (not move, copy). Then, leaf through the images and delete all the ones that do not depict hovering. Use your judgment.
- With `generate_training_data_from_handclassified_shingles.rb`, generate `training_data.csv`, which should soon include data about each shingle, plus the features we generated. If you use additional `..._hover_only` folders in `hand_coded_training_data`, be sure to record them in the Ruby script.
- Then do some scikit-learn magic... (a sketch of what that might look like follows this list).
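In case it helps, here's a minimal sketch of that last step, assuming `training_data.csv` has numeric feature columns plus a label column (I've called it `hover` and the ID column `shingle_id` here; the real column names come from the Ruby script and may differ, as may the model you end up liking):

```python
# A minimal sketch of the "scikit-learn magic" step. Column names and
# model choice are assumptions, not the repo's settled approach.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

df = pd.read_csv("training_data.csv")

# Hypothetical: everything except identifier/label columns is a feature.
label = df["hover"]
features = df.drop(columns=["hover", "shingle_id"], errors="ignore")
features = features.select_dtypes("number").fillna(0)

clf = RandomForestClassifier(n_estimators=200, random_state=0)

# Cross-validate before trusting it; hand-coded sets are usually small.
scores = cross_val_score(clf, features, label, cv=5)
print("fold accuracies:", scores)

clf.fit(features, label)
```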
Experiment. Download the Tweets by NYCFireWire from Google Sheets. Run `nycfirewire_parser.rb`. Upload the resulting `NYCFireWire_tweets.csv` to the TAMU batch geocoder. Download that to `NYCFireWire_tw.csv` and you have points for that guy's tweets. (A sketch of matching those points to hover centers follows.)
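Eventually #4 means joining those points to the hover centers by time and place. A minimal sketch, with made-up file and column names (`hover_centers.csv`, `time`, `lat`, `lon`, `tweeted_at`, `text`); the geocoded CSV's actual headers come from the TAMU geocoder output:

```python
# Match hover centers to geocoded NYCFireWire incidents that happened
# nearby, around the same time. Thresholds here are guesses to tune.
import math

import pandas as pd

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat, dlon = p2 - p1, math.radians(lon2 - lon1)
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    return 6371 * 2 * math.asin(math.sqrt(a))

hovers = pd.read_csv("hover_centers.csv", parse_dates=["time"])
tweets = pd.read_csv("NYCFireWire_tw.csv", parse_dates=["tweeted_at"])

matches = []
for h in hovers.itertuples():
    for t in tweets.itertuples():
        close_in_time = abs(t.tweeted_at - h.time) < pd.Timedelta("45min")
        close_in_space = haversine_km(h.lat, h.lon, t.lat, t.lon) < 1.0
        if close_in_time and close_in_space:
            matches.append((h.time, h.lat, h.lon, t.text))

for m in matches:
    print(m)
```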
To deploy the scripts to the server (swap in your own host for `whatever`):

```
scp -r ../nypdcopterbot/*.rb ec2-user@whatever:/home/ec2-user/nypdcopterbot/
scp -r ../dump1090-mapper/*.js ec2-user@whatever:/home/ec2-user/dump1090-mapper/
```