This summer, a dozen engineers, product designers, and policy leads from the Sidewalk and Flow teams took a break from their work developing new urban technology tools and held a two-day hackathon — developing urban technology tools. What can I say: they love cities.
The point of the sprint was simply to see what fun demos a bunch of bright minds could hack together in a brief window. But it produced three tools that could help people navigate their city in new ways, serve as a reference for urban planners, or inspire the civic tech community to pursue similar efforts. So we wanted to share the results with a wider audience.
Keep in mind, these are just one-off demos. But if you spot any problems, or just want to offer your take, we welcome responses below.
You, me, and a tree
Team: Corinna Li, David Paduano, Neha Rathi, John Wittrock
It’s still summer for a little longer, and when it’s not a thousand degrees outside, nothing beats meeting a friend at a nice, leafy park. But if you live in New York City it’s not always easy to find a spot that’s equally convenient for both of you. That’s where the “You, me, and a tree” demo comes in. The goal was to help two people find a centrally located park that isn’t necessarily … Central Park.
To use the app, first pick two starting locations. A transit router then finds parks that both people can reach in roughly the same time via walking and the subway. (As a proxy for travel fairness, the list of parks is ranked by the sum of squares of the two travel times, lowest first. That approach favors two 10-minute trips, which have a sum of 200, over one 20-minute trip and one zero-minute trip, which have a sum of 400.) The app finds parks — and, more specifically, the optimal park entrance — based on a cache of freely available OpenStreetMap data.
So if you’re coming from Crown Heights and your friend is coming from Brooklyn Heights, you can meet at Grand Army Plaza by traveling roughly 10–12 minutes each:
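The fairness ranking described above can be sketched in a few lines. This is only an illustrative stand-in: the park names, travel times, and function names below are made up, and the real app computes times with a transit router rather than a lookup table.

```python
# Hedged sketch of the sum-of-squares fairness ranking. All numbers and
# park names are illustrative, not output from the actual demo.

def fairness_score(times):
    """Sum of squared travel times: lower means fairer and shorter overall."""
    return sum(t * t for t in times)

def rank_parks(travel_times):
    """travel_times maps park name -> (minutes for person A, minutes for person B)."""
    return sorted(travel_times, key=lambda park: fairness_score(travel_times[park]))

# Illustrative travel times only.
times = {
    "Grand Army Plaza": (10, 12),        # score 244
    "Prospect Park (Parkside)": (20, 0), # score 400
    "Fort Greene Park": (18, 6),         # score 360
}
print(rank_parks(times))
# ['Grand Army Plaza', 'Fort Greene Park', 'Prospect Park (Parkside)']
```

Note how the 20-minute/0-minute split scores worst even though its total travel time ties the winner's: squaring penalizes lopsided trips.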
The YM&T demo has its limitations. OSM defines a park in a way that leaves out some pretty obvious ones, as the team discovered when Highland Park on the Brooklyn-Queens border didn’t turn up during a spot check for data integrity. Currently the app is also confined to New York City’s five boroughs and doesn’t include other travel modes, such as buses or bikes. But it’s useful to think about new tools that can help people connect in the city — especially without a car.
Urban pattern typologies
Team: Jacob Baskin, Stephen Kennedy, Amy Kyleen Lute, Andrew Warren
Computer scientists build software using design patterns, or replicable solutions to common problems. But it turns out design patterns originally came from the world of architecture and urban planning. They were described in Christopher Alexander’s 1977 book, A Pattern Language. Alexander writes:
Each pattern describes a problem which occurs over and over again in our environment, and then describes the core of the solution to that problem, in such a way that you can use this solution a million times over, without ever doing it the same way twice.
While leafing through that book, the members of this hackathon team thought people designing urban spaces might benefit from being able to quickly browse real-world examples from other cities. So they created a search engine covering 10 international cities that could pull up two types of urban designs: public plazas and five-way intersections. Once again they relied on OpenStreetMap for data on these geographic designs.
Using the Urban Pattern Typology demo, a planner incorporating one of these street elements could quickly find examples from another city, read an overview of the design features, and see an aerial picture via Earth View:
This demo has limitations, too. The crew chose five-way intersections and public plazas because they could be identified reliably with OSM queries. But with more time and increasingly accurate geotags, it’s possible to imagine a program like this one showcasing a whole suite of street design archetypes: crosswalks, bike paths, green spaces, bus stops, promenades, interchanges, and more.
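One way to see why five-way intersections are reliably identifiable in OSM-style data: they fall out of simple graph structure, with no semantic tags needed. A minimal sketch, assuming toy data rather than real OSM output (a real implementation would query actual ways and node IDs):

```python
# Illustrative sketch: flag nodes where five or more street legs meet.
# The "ways" structure and street names below are invented toy data, not
# OpenStreetMap's actual schema.
from collections import defaultdict

def leg_count(ways):
    """Count street legs (way segments) meeting at each node.

    An interior node of a way contributes two legs (the street continues
    through it); an endpoint contributes one.
    """
    legs = defaultdict(int)
    for nodes in ways.values():
        for i, node in enumerate(nodes):
            legs[node] += 2 if 0 < i < len(nodes) - 1 else 1
    return legs

def five_way_nodes(ways):
    """Nodes with five or more legs: five-way (or larger) intersections."""
    return [node for node, k in leg_count(ways).items() if k >= 5]

# Two streets crossing at X, plus a diagonal terminating there: 5 legs.
ways = {
    "Broadway": ["a", "X", "b"],
    "FifthAve": ["c", "X", "d"],
    "Diagonal": ["e", "X"],
}
print(five_way_nodes(ways))
# ['X']
```

Plazas are similarly mechanical to find (closed ways tagged as pedestrian areas), which is why the team picked these two typologies first.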
Counting cabs and blocked bike lanes
Team: Sven Kreiss, Willa Ng, Dan Vanderkam, Daniel Yehuda
The New York City Department of Transportation operates cameras around the city to identify congestion points in real time and help traffic move as smoothly as possible. Many traffic cams were added as part of the city’s award-winning Midtown in Motion system, implemented in 2011 by former DOT chief Janette Sadik-Khan, which improved travel time by 10 percent in the 110-block midtown area of Manhattan and reduced pollution. These camera feeds are available for viewing at the city’s official real-time signal site.
Traffic cameras can do more than detect gridlock when combined with other technology. This demo team applied computer vision to test the feasibility of counting yellow cabs in the feeds. Using background subtraction, they found they could distinguish fixed elements in the feeds, such as roads and storefronts, from moving vehicles. Then they further distinguished yellow cabs from other cars by their color:
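The two steps named above, background subtraction and a color test, can be illustrated with a toy NumPy example. This is a deliberately crude sketch: it uses a per-pixel median as the background model (a production pipeline would use a proper subtractor such as OpenCV's MOG2), and the frames, thresholds, and function names are all invented for illustration.

```python
# Toy illustration of background subtraction plus a yellow-color filter.
# Frames are small synthetic RGB arrays; thresholds are made up.
import numpy as np

def moving_mask(frames, threshold=30):
    """Pixels in the latest frame that differ from the median background."""
    background = np.median(frames, axis=0)
    diff = np.abs(frames[-1].astype(int) - background.astype(int)).sum(axis=-1)
    return diff > threshold

def yellow_mask(frame):
    """Crude RGB test for yellow: strong red and green, weak blue."""
    r, g, b = frame[..., 0], frame[..., 1], frame[..., 2]
    return (r > 150) & (g > 150) & (b < 100)

def count_cab_pixels(frames):
    """Pixels that are both moving and yellow in the latest frame."""
    return int((moving_mask(frames) & yellow_mask(frames[-1])).sum())

# Three frames of a gray street; a 2x2 yellow "cab" appears in the last one.
frames = np.full((3, 4, 4, 3), 100, dtype=np.uint8)
frames[-1, :2, :2] = (255, 255, 0)
print(count_cab_pixels(frames))
# 4
```

The fixed gray pixels drop out of the moving mask entirely, so the count isolates just the newly appeared yellow patch, which is the same logic, at toy scale, that separates cabs from roads and storefronts in a real feed.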
This approach has many other applications. For instance, you could use a similar technique to quantify how often bike lanes are blocked by other vehicles. (On one day of the hackathon, two buses not only collided in a bike lane at 6th Avenue and 34th Street, but proceeded to block the lane for hours, presumably to sort things out.) You could also use this type of analysis to assess the impact of, say, a new curb policy on freight movement or a new crosswalk on pedestrian behavior.
In any case, knowing what goes on at a busy urban intersection is the first step toward making it flow more freely for all.