Google has released a massive dataset of building footprints, extracted from high-resolution satellite imagery:
The dataset contains 1.8 billion building detections, across an inference area of 58M km² within Africa, South Asia, South-East Asia, Latin America and the Caribbean.
1.8 billion features is enormous, and it doesn’t even cover North America, Western Europe, most of China, Japan, or Australia. Incredible stuff, licensed under CC BY 4.0 and ODbL.
Geoff Zeiss on combining satellite imagery and spatial analysis to identify tree encroachment in utilities:
Transmission line inspections are essential in ensuring grid reliability and resilience. They are generally performed by manned helicopters often together with a ground crew. There are serious safety issues when inspections are conducted by helicopter. Data may be collected with cameras and analyzed to detect a variety of conditions including corrosion, evidence of flash over, cracks in cross arms, and right-of-way issues such as vegetation encroachment. In North America annual inspections are mandated by NERC and are not optional. With over 200,000 miles of high-voltage transmission lines and 5.5 million miles of distribution lines in the United States, improving the efficiency and reducing the risk of inspections would have a major impact on the reliability of the power grid.
Google Sheets now supports using BigQuery data inside Sheets features like pivot tables and formulas, which means an orders-of-magnitude increase in data limits.
My goal tracking efforts pale in comparison to what Julian Lehr is doing. I might give Airtable a try for mine as well. I’ve been in Google Sheets since mine’s pretty basic, but Airtable might make it more mobile-friendly for editing.
What stands out most is Tesla’s integrated central control unit, or “full self-driving computer.” Also known as Hardware 3, this little piece of tech is the company’s biggest weapon in the burgeoning EV market. It could end the auto industry supply chain as we know it.
One stunned engineer from a major Japanese automaker examined the computer and declared, “We cannot do it.”
This is a great post from my friend Joe Morrison assessing the state of the open source software movement in GIS against the biggest corporate competitive friction point, Esri.
Even as QGIS and comparable open source projects have invested in better branding, that doesn’t mitigate the ultimate killer in the quest to replace Esri at most large organizations: misaligned incentives. The typical IT department at a large company or government agency is defensive by nature due to the asymmetrical risk of adopting new technology; you can get fired for championing a new software that winds up becoming a costly failure, but it’s tough to get fired for preserving the status quo.
He nails just about every point I’d make about the challenges facing not only open source projects, but any new entrant competing in an already-mature space.
I think inertia and incentives are such massive forces, bigger than most of us technologists realize at first, that have to be overcome before you’ll get any traction in a wide marketplace. Even if you’re coming at a problem in a largely new way (creating a new category of software), users will still prefer to find the closest comparable analog to stack you up against, even if unfairly. To reach beyond the “early adopters” and tinkerers, you have to somehow overcome these challenges — get good at telling your story, be 10X better (3X or 5X usually isn’t enough), and have the persistence and patience to chew away at the problem for a long time. The chasm is real, and taking the wheel away from the incumbents is an enormous undertaking.
An RTIN mesh consists of only right-angle triangles, which makes it less precise than Delaunay-based TIN meshes, requiring more triangles to approximate the same surface. But RTIN has two significant advantages:
The algorithm generates a hierarchy of all approximations of varying precisions — after running it once, you can quickly retrieve a mesh for any given level of detail.
It’s very fast, making it viable for client-side meshing from raster terrain tiles. Surprisingly, I haven’t found any prior attempts to do it in the browser.
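To make the idea concrete, here’s a minimal sketch of RTIN-style meshing in Python. This is not the author’s implementation: it assumes a square heightmap with `2**k + 1` samples per side (flattened row-major), and it recomputes triangle errors on each query rather than precomputing the full error hierarchy the way the real algorithm does.

```python
def rtin_mesh(heights, size, max_error):
    """Approximate a square heightmap with right-angle triangles,
    splitting only where the surface deviates by more than max_error.

    heights: flat row-major list, size*size values; size must be 2**k + 1.
    Returns a list of triangles as ((ax, ay), (bx, by), (cx, cy)) tuples.
    """
    triangles = []

    def split(ax, ay, bx, by, cx, cy):
        # (a, b) is the hypotenuse, c is the right-angle corner.
        # Stop if the hypotenuse midpoint falls off the integer grid
        # (the triangle can't be split any further).
        if (ax + bx) % 2 or (ay + by) % 2:
            triangles.append(((ax, ay), (bx, by), (cx, cy)))
            return
        mx, my = (ax + bx) // 2, (ay + by) // 2
        # Error: actual height at the hypotenuse midpoint vs. linear
        # interpolation between the hypotenuse endpoints.
        interpolated = (heights[ay * size + ax] + heights[by * size + bx]) / 2
        if abs(heights[my * size + mx] - interpolated) <= max_error:
            triangles.append(((ax, ay), (bx, by), (cx, cy)))
            return
        # Bisect the hypotenuse: the midpoint becomes the right-angle
        # corner of the two child triangles.
        split(cx, cy, ax, ay, mx, my)
        split(bx, by, cx, cy, mx, my)

    last = size - 1
    split(0, 0, last, last, last, 0)   # lower-right half of the square
    split(last, last, 0, 0, 0, last)   # upper-left half
    return triangles
```

Because splitting depends only on a per-triangle error threshold, varying `max_error` yields coarser or finer meshes of the same terrain, which is the level-of-detail property the post describes (the real algorithm caches the errors once so every level is a cheap lookup).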
This is an interesting piece on the Figma blog about Notion and their design process in getting the v1 off the ground a few years ago. I’ve been using Notion for a while and can attest to the craftsmanship in design and user experience. All the effort put into iteration really shows in how fluid the whole app feels.
I’m always a sucker for a curated list of reading recommendations. This one’s from Stripe founder Patrick Collison, who seems to share a lot of my interests and curiosities.
Our friend and colleague Kurt Menke of Bird’s Eye View GIS recently conducted a workshop in Hawaii working with folks from the Pacific Islands (Samoa, Marianas, Palau, and others) to teach Fulcrum data collection and QGIS for mapping. Seeing our tech have these kinds of impacts is always enjoyable to read about:
The week was a reminder of how those of us working with technology day-to-day sometimes take it for granted. Everyone was super excited to have this training. It was also a lesson in how resource rich we are on the continent. One of my goals with Bird’s Eye View is to use technology to help make the world a better place. (Thus my focus on conservation, public health and education.) One of the goals of the Community Health Maps program is to empower people with technology. This week fulfilled both and was very gratifying.
Most of the trainees had little to no GIS training yet instantly knew how mapping could apply to their work and lives. They want to map everything related to hurricane relief, salt water resistant taro farms, infrastructure related to mosquito outbreaks etc. A benefit of having the community do this is that they can be in charge of their own data and it helps build community relationships.
This week we’ve had Kurt Menke (of Bird’s Eye View GIS) in the office providing a guided training workshop for QGIS, the canonical open source GIS suite.
It’s been a great first two days covering a wide range of topics from his book titled Discovering QGIS 3.
The team attending the workshop is a diverse group with varied backgrounds. Most are GIS professionals using this as a means to get a comprehensive overview of the basics of “what’s in the box” in QGIS. All of the GIS folks have the requisite background using Esri tools throughout their training, but some of us who have been playing in the FOSS4G space for longer have been exposed to and used QGIS for years for getting work done. We’ve also got a half dozen folks in the session from our dev team who know their way around Ruby and Python, but don’t have any formal GIS training in their background. This is a great way to get folks exposure to the core principles and technology in the GIS professional’s toolkit.
Kurt’s course is an excellent overview that covers the ins and outs of using QGIS for geoprocessing and analysis, and touches on lots of the essentials of GIS (the discipline) along the way. All of your basics are in there — clips / unions / intersects and other geoprocesses, data management, editing, attribute calculations (with some advanced expression-based stuff), joins and relates, and a deep dive on all of the powerful symbology and labeling engines built into QGIS these days.¹
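Under the hood, geoprocesses like clip and intersect boil down to computational-geometry primitives. The simplest of these is the point-in-polygon test, sketched here in plain Python using the classic ray-casting approach (this is an illustrative sketch, not anything from the workshop or from QGIS itself):

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting point-in-polygon test.

    polygon: list of (x, y) vertices in order (closed implicitly).
    Casts a ray to the right of (x, y) and counts edge crossings;
    an odd count means the point is inside.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does this edge straddle the horizontal line through y?
        if (y1 > y) != (y2 > y):
            # X coordinate where the edge crosses that horizontal line.
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside
```

Real GIS engines (QGIS included, via GEOS) use far more robust variants of tests like this, but the principle is the same: every clip, union, or spatial join is built out of small geometric predicates applied feature by feature.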
The last segment of the workshop is going to cover movement data with the Time Manager extension and some other visualization techniques.
¹ Hat tip to Nyall Dawson of North Road Geographics (as well as the rest of the contributor community) for all of the amazing development that’s gone into the 3.x release of QGIS!
We recently began a corporate sponsorship of the QGIS project, the canonical open source desktop GIS. I got back into doing some casual cartography work with QGIS back in December after a years-long hiatus (I don’t get to do much actual work with data these days). Bill wrote up a quick post about how we rely on it every day in our work:
As a Mac shop, the availability of a sophisticated, high-quality, powerful desktop GIS that runs natively on macOS is important to us. QGIS fits that bill and more. With its powerful native capability and its rich third-party extension ecosystem, QGIS provides a toolset that proprietary software — between core applications and the add-on extensions needed to replicate the full power of QGIS — would cost well more than a QGIS sponsorship to match.
The team over at Lutra Consulting, along with the rest of the amazing developer community around QGIS, has been doing fantastic work building a true cross-platform application that stacks up against any other desktop GIS suite.
I pretty much watch Esri from a distance these days but, even from my perch, I can’t see any workflow that doesn’t involve desktop software at some point. It’s simply still crucial to the editing and publishing of maps and other geospatial content.
That’s even true outside of the Esri world. My team lives in QGIS. As of this writing, QGIS is at version 3.4.4 and it’s hard for me to make a case that I need to use anything else, aside from convenient proprietary hooks to other systems and workflows. We do have a lone ArcMap instance for the production of MXDs, but we make heavy use of FME so we can produce data in even the most proprietary of formats if needed, without consuming a license of anything additional.
In our universe we work so much with creating data, QA, cleaning, packaging, and analysis that desktop GIS is (as Bill says) still critical to the workflow.
I would add that I’ve been very impressed with the state of the QGIS product in my recent usage of it. It’s amazing that with community development, time, and step-by-step progress, an open source developer community can build such a powerful product. Of course QGIS is also underpinned by dozens of super-powerful libraries that are amazing utilities in and of themselves — GDAL, Proj4, GEOS, and more.
Our friends over at the Santa Barbara County Sheriff’s Office have been using a deployment of Fulcrum Community over the last month to log and track evacuations for flooding and debris flow risk throughout the county. They’ve deployed over 100 volunteers so far to go door-to-door and help residents evacuate safely. In their initial pilot they visited 1,500 residents. With this platform the County can monitor progress in real time and focus their resources on the areas that need the most attention.
“This app not only tremendously increased the accountability of our door-to-door notifications but also gave us real-time tracking of the progress of our teams. We believe it also reduced the time it has historically taken to complete such evacuation notices.”
This is exactly what we’re building Community to do: to help enable groups to collaborate and share field information rapidly for coordination, publish information to the public, and gather quantities of data through citizens and volunteers they couldn’t get on their own.
From Howard Butler comes this amazing public dataset of LiDAR data from the USGS 3D Elevation Program. There’s an interactive version here where you can browse what’s available. Using this WebGL-based viewer you can even pan and zoom around in the point clouds. More info here in the open on GitHub.
Microsoft published this dataset of computer-generated building footprints, 125 million in all. Pretty incredible considering how much labor it’d take to produce with manual digitizing.
Each year GIS developer and cartographer Nyall Dawson puts together a thread of daily tweets leading up to Christmas, each with a helpful tip for QGIS. You can see all of them at the hashtag #24daysofqgis.
I’ve had R on my list for a long time to dig deeper into. A while back I set myself up with RStudio and went through some DataCamp courses. This online book looks like excellent material on applying R to geostatistics.
Given where we are with Fulcrum in the product lifecycle, this rang very familiar: the struggle of how to listen to customers effectively, whom to listen to, and how to absorb or deflect ideas. Once you get past product-market fit, the same tight connection between your customers and product team becomes impossible. Glad to hear we aren’t alone in our struggles here.
This piece from David Heinemeier Hansson is a good reminder that steady, linear growth is still great performance for a business. Every business is in a different situation, and certainly many carry debt or investment obligations for which linear growth isn’t enough. Even so, consistent growth in the positive direction should always be commended.