
Drones Are Speeding Hurricane Harvey Response - rpark
http://3dinsider.com/drones-hurricane-harvey/
======
ortusdux
I've said in the past, over a few beers, that instead of just switching the
doodle over, Google should integrate disaster images and video into Google
Maps and let people see the damage firsthand. The effect of spatializing
this information is truly profound. I did not understand Katrina until I
visited the 9th Ward and saw the damage myself.

I've also said over beers that we need a system to aggregate disaster
information. In a situation where we have 100+ fires and 1000+ flood related
rescues, sifting through the noise is a nearly impossible task.

~~~
cx1000
Real time drone control from your browser could have a similar effect, and
even help find people who are stranded: [http://cape.com](http://cape.com)

~~~
meri_dian
Can't wait to try that out when I get back to my laptop.

------
Keyframe
Drones are useful in the most unexpected areas. I've recently been involved in
a project that is using drones to identify possible sites of missing persons
after armed conflicts. "Missing", unfortunately, here means the drones are
used to identify possible locations of graves, especially mass graves.

The challenge is that this tech approach is severely under-funded for now;
once it yields (more) results, it can be shown to the relevant funding sources
(military and civil). It's not my project, I'm only somewhat involved in it,
but I'm looking to connect with drone makers as well as multi-spectral imaging
sensor makers who could participate in the project. The project is located in
Croatia, but is not limited to it, of course, though the region is,
unfortunately, well-suited to this kind of work. If anyone knows something, or
is a drone or imaging sensor maker, and is interested, shoot me an email -
it's on my profile page.

------
kozikow
At tensorflight.com we are building a deep learning model for Hurricane Harvey
damage segmentation.

We already have 12K instances of partial building damage and 400 of total
damage annotated, and a preliminary segmentation model is being trained on 4
K80s. If you want to collaborate in any way, please let me know at
kozikow@tensorflight.com

~~~
rubyfan
Very interested in this work. What imagery source is even available this
quick?

~~~
maxerickson
NOAA has some post storm imagery:

[https://storms.ngs.noaa.gov/storms/harvey/index.html](https://storms.ngs.noaa.gov/storms/harvey/index.html)

------
josephpmay
I love how they invent an idealized scenario to come up with the 800% number.
Good marketing, I guess.

~~~
dbuxton
Well, it's also 700% in their specific scenario (an 8x speedup is a 700%
increase, since 2x = 100%, etc.), so it's not even an accurately invented
story...

------
yourapostasy
My SO helped with the crowdsourced civilian effort to dispatch water rescues
of stranded people. The effort is wound down now; it will likely be hailed as
a coup of "social media", but that's overhyping the social media aspect.
Social media got the word out of where to go online to enter the coordination,
but had little to do with the actual coordination itself. The group my SO
worked with coordinated mostly through the Zello mobile app, the Glympse
mobile app, houstonharveyrescue.com (going dark soon, due to PII concerns),
and a Google Sheet to track water rescue requests.

TFA mainly discusses the use of drones in the recovery phase, and only
tangentially touches upon drones for rescues. Drones were not used much for
long-range water rescues, because they were a hazard for the many volunteer
helicopters that responded.

This pointed out the need for a solution (preferably as automated as possible)
that allocates helicopter and drone flight paths in a disaster area.

The whole experience was very eye-opening for me. There isn't a good solution
for coordinating disaster response by civilians, but even just the _ad hoc_
quick-and-dirty collection of apps used by the various civilian groups that
responded showed how much leverage Internet-enabled coordination delivered.
The latency of the civilian response is much lower than that of the
government, but once the government landed resources, the government response
had much greater volume. Mix both groups at the right times, and you'd have an
admirable disaster response, which is pretty much what happened in Houston.

Observations from listening in on my SO during meal times (the only times I
could break from work):

* Misinformation is rife. This is a difficult problem to address. Example: rumor starts that a rescuer was slashed with a machete. Story morphs into shot and slashed, then slashed-got-sepsis. Turns out a guy stepped on glass and got a nasty gash.

* No good solution to map rapidly-changing road conditions. Piles of rescuers with valuable boats in the first critical hours of response were diverted to drive around to find a way into the right areas of Houston to deploy. We need a way to effectively intake reports from people with just trucks (lots of citizens responding with no boats wanted to help in some way): snapping pictures at a specific location, giving location and time, and reporting road closures due to a specified height of water, electrical lines, _etc._ Bonus for AR-enabled measurement of water depth, based upon a baseline measurement of the vehicle. Extra bonus for measuring water speed by tossing a recognized object into the water and tracking it. Then people who pull up the heat map of closures could flood-fill (pardon the pun) out possible routes, avoiding lots of redundant checks of possible routes. A lot of valuable time was wasted on this; the first few hours were filled with civilians an hour from arriving at the area (as instructed over social media) calling in and asking how they could reach the boat drop points, because the main routes were all closed.

* No good solution to map flooded areas, how deep, and forecasted levels. People pieced it together by hand and passing along the grapevine. Depth matters: below a certain level, outboards were getting stuck. Below a different level, and all boats had to watch for fences they could get snagged on (had a few that capsized on such obstacles). Ideal: remote-reporting gauges scattered in a grid pattern throughout the area, or gauges that can be dropped down during the initial rescue efforts, and reclaimed later.

* We reached out to Uber and Lyft. IMHO, this was a PR coup sitting around for the taking. You have a system that optimizes for efficiently tracking and queuing requests, matching requests to vehicle capacity, directing the closest vehicle to the request, and showing requesters the live status. This was precisely what the water rescue coordination needed. Uber gave a canned "we're standing down for the safety of our drivers, for those who are outside of the areas of Houston that kicked us out that we can still operate in". Lyft said great idea, but the conversation black holed after that.

* Any app-based solution will have to be very sensitive to energy usage. Rescue requesters ran out of power on their phones distressingly often. Zello was established early on as a bad way to communicate with requesters; it drained batteries very quickly. Instead, requesters reached out to relatives/friends they knew who were safe, instructed them how to get Zello and get on the rescue channels, then put in a request, and then those relatives/friends would periodically query for a status update on the request. Use strongest WiFi if available, fall back to cell data (lowest-tech with strongest signal available), then SMS, then voice.

* These status update requests (see previous entry) took up a lot of bandwidth at the height of rescues, and added to the stress on the rescuers. A queuing system that operated over WiFi if available, then over cell data if not, telling requesters they are number N in line for the nearest rescuer, would have made the coordinating a lot easier.

* A unique ID was eventually established for assigning each request. A voice recognition system could easily listen in on a group and automatically assemble in timeline form all conversations that mention a particular ID, so anyone looking at a particular rescue request could see all historical discussion about that request.

* The Zello conversations quickly got unwieldy when there were too many people vying for "the microphone". Fortunately, people figured out how to manage this somewhat, splitting into Port Arthur and Houston-specific channels, for example. An app to auto-split by role (dispatcher, rescuer, requests, _etc._ ) and density-based geography (bounded by neighborhood boundaries, perhaps) would have helped some of the confusion.

* A voice recognition system to simply assign people to the right channel based upon their initial request would be helpful. There was an opportunity here for someone like Twilio to set up a single phone number that did this. In the first few hours, people did this through their personal lines: "Call me at xxx-xxx-xxxx when you are an hour out on I-10 from Conroe to get the current rally point." Then you hear later: "I'm an hour out, called xxx-xxx-xxxx number as instructed, and it's been busy for the last 15 minutes, what else can I do to find the current rally point?"

* Most useful feature of Zello: historical recording of every single transmission while you were listening. This let people go through them and follow up on water rescue requests, then mark them safe if they were rescued. This was a big problem at first: rescuers were pulling up a map on the web app, rushing to a request, then getting disappointed when they find out the rescue request was long since taken care of by another rescuer. Zello could improve on this: the historical recording was only for the duration you were listening; a feature (even paid) that pulled last N minutes/hours from their servers would be even better. Even better is a solution that tracks a rescuer to a rescue request, then presents a simple confirmation screen (# of adults, elderly, children, disabled, babies, pets rescued, any variation from pre-arranged drop-off point, any voice notes required), and auto-marks a request.

* Need a solution that maps water conditions at a specific location, ideally with tagged input of submitter, time, audio/picture/video, and NMEA data. At one point, the flat-bottomed boats were having a lot of trouble navigating choppy waters as Harvey came back in and churned up the flooded areas. Fold in with weather data, and predictively age out the conditions if possible, displaying that the computer model _thinks_ conditions might be so-and-so but be careful because it could still be the reported condition, until another submission confirms calmer conditions.

* People _REALLY love to help_. But if their efforts go unappreciated, or go to waste (about the same as unappreciated), they will get dejected _very_ quickly. This is why precise, comprehensive coordination is so critical to manage.

* There was initial concern about fake rescuers. This concern should not be dismissed, but as far as we could tell, this didn't happen.

* Need a solution that maps shelter facilities / government resources as they come online, their capacity, and current utilization, so rescuers can efficiently forward rescuees, most of whom are somewhat in a state of shock, to the best available facilities. Many shelter facilities early on were just school gyms, churches/temples/mosques, warehouses, and retail stores.

* A large number of private helicopters volunteered early. Knowing which medically critical requests to prioritize was all done manually.

* I suspect that the _ad hoc_, thrown-together collection of apps used to coordinate the rescues is close to its scalability limit. About 10K rescues were logged, in round numbers. I don't think the same approach will work beyond 3-50K rescues, because bottlenecks were becoming apparent to me even with what we had.

* Best part of Internet-enabled rescue coordination: anyone now has a choice to actively participate in the rescue no matter where in the world they are. That's incredibly powerful and a game-changer.
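The closure heat map / flood-fill idea a few bullets up could be sketched as a breadth-first flood-fill over a road grid, marking which staging points can still reach which areas. Everything here (the `passable_region` helper, the toy grid) is my own hypothetical illustration, not part of any app that was actually used:

```python
from collections import deque

def passable_region(grid, start):
    """BFS flood-fill from a staging point over a road grid.
    grid[r][c] is True if that road segment is reported open,
    False if a closure (high water, downed lines) was reported.
    Returns the set of cells reachable from `start`."""
    rows, cols = len(grid), len(grid[0])
    seen = {start}
    queue = deque([start])
    while queue:
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append((nr, nc))
    return seen

# Toy 3x3 map: the center column is flooded, but the bottom row is open.
grid = [
    [True, False, True],
    [True, False, True],
    [True, True,  True],
]
reachable = passable_region(grid, (0, 0))
```

With real closure reports feeding the grid, rescuers pulling up the map would see at a glance which drop points are still reachable instead of each caravan re-probing the same blocked routes.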

That's all I have off the top of my head.

All in all, I'm quite impressed how well this went, despite the difficulties
and setbacks I saw, and it shows some of the best humanity has to offer. There
are some really interesting, deep CS and software engineering problems to
solve in disaster response management and coordination.

Special shame on Joel Osteen: after his weak response compared to the area
churches and mosques with far less money, which threw their doors open within
the first hours after Harvey hit, he should be ostracized, as if "prosperity
ministry" weren't bad enough on its own. I didn't know who this bloke was
before Harvey, other than "some guy who runs a megachurch", but after reading
the news stories, I can't believe he convinces so many parishioners to follow
him.

~~~
thebiglebrewski
Awesome post! Something tells me a non-profit startup is in the works!

~~~
enjeyw
Ooooo oooo pick me! We're doing something in this space. Very much early days,
but we DO have seed funding :)

https://sempo.ai if you're interested

~~~
thebiglebrewski
Why do you need "AI" to solve this problem?

~~~
enjeyw
Yeah, good question - we're particularly interested in filtering, categorising
and verifying the social media data that's published during a disaster. The
volume of this data is absolutely huge, so if you can use some sort of
classification algorithm (even basic stuff like an SVM), then you can
potentially massively reduce the workload of responders, who just don't have
the time to do that sort of manual work themselves - they're too busy doing
the actual response.

~~~
thebiglebrewski
Is that really an AI though or just a bunch of if-elsif-else loops? Is a true
artificial intelligence really required?!

~~~
enjeyw
If-elif chains definitely get you part of the way there.

Let's say we're trying to find the tweets that actually come from people
affected by a disaster, rather than just supplying commentary about it.

You could just say "if tweet is not within x distance of disaster, then tweet
is likely to be commentary rather than from someone actually affected".
However, this could always miss a tweet like

"My elderly mother just called me, and she's in trouble but doesn't know who
to contact"

So really this is a case of "given the tweet is not within x distance, the
likelihood of it being from an affected person is lower", but of course we
don't know what the effect on the likelihood is (I couldn't even ballpark it
for you).

If you had some classified data, though, you could start to get an idea of
what all these Bayes factors are. And just like that, you're doing a naive
Bayes classifier. As I said, not fancy "deep learning with CNNs", but ML
nonetheless.
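A minimal sketch of that estimate, assuming hand-labeled (far_away, affected) pairs. The `bayes_factor` helper and the toy labels are entirely made up for illustration, not anything sempo.ai actually runs:

```python
def bayes_factor(labeled):
    """Estimate how much the feature "tweet came from outside the
    disaster area" shifts the odds that it is from an affected person.
    labeled: iterable of (far_away, affected) boolean pairs from
    hand-classified tweets. Laplace smoothing keeps unseen
    combinations from zeroing out the estimate."""
    far_affected = sum(1 for far, aff in labeled if far and aff)
    affected = sum(1 for _, aff in labeled if aff)
    far_commentary = sum(1 for far, aff in labeled if far and not aff)
    commentary = sum(1 for _, aff in labeled if not aff)
    p_far_given_affected = (far_affected + 1) / (affected + 2)
    p_far_given_commentary = (far_commentary + 1) / (commentary + 2)
    return p_far_given_affected / p_far_given_commentary

# Toy labels: 1 of 4 affected tweets was far away; 8 of 10 commentary tweets were.
data = ([(True, True)] + [(False, True)] * 3
        + [(True, False)] * 8 + [(False, False)] * 2)
bf = bayes_factor(data)
# Posterior odds = prior odds * bf. Here bf < 1, so a far-away tweet
# lowers, but doesn't eliminate, the odds of being from an affected person.
```

Multiply one such factor per feature into the prior odds and you have exactly the naive Bayes classifier the parent describes.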

~~~
thebiglebrewski
I guess I see what you're saying.

...couldn't you classify the data with if/elsif/else statements though? And
store likelihoods of past records within the database?

Do you need to use a "Bayesian classifier"?

Is this all that artificial intelligence means now? I thought it was like, "a
machine that can pass the Turing test"? Sorry if I'm just naive.

------
derekjobst
The subtle scroll hijacking on this web page makes it almost unreadable.

------
mckoss
Huh? The article title makes a claim of 8x speed up in disaster recovery
through drone use, but the text of the article says the government has
restricted all civilian drone use for a month in the Houston area (to
deconflict with military aircraft). These two things don't seem to jibe.

~~~
brokentone
The insurance investigators appear to be commercially licensed.

------
microcolonel
Aside: This is why speedy regulation of drones will probably cost lives in
lost opportunity.

------
aaron695
Why would insurance companies want to speed up payments?

Wouldn't this cost them millions?

~~~
brokentone
Moving from unrealized to realized cost faster is, sure, worse for a given
quarter, etc., but eventually it will all be realized. It is on their balance
sheets in some way regardless. The bigger picture is that the human capital to
assess all the claims is quite expensive, and spending 8x less on that is a
benefit to them.

