Bootstrapping your New Space Start-Up: a Geospatial Perspective
Written by: Dan Pilone, Element 84
At Element 84, our background in geospatial software has allowed us to work with a non-trivial number of up-and-coming space start-ups, often before they take the public stage. With a strong interest in open source, we’re excited about positioning new space start-ups so they can make huge strides – and ultimately make it big in the industry.
With the conference circuit spinning back up for the year, I had the chance to chat with a number of folks from new space start-ups starting to get the ball rolling. For anyone else grappling with the same questions, I’ve outlined three of the most important keys to success from a geospatial perspective.
Geospatial tips for your new space start-up
1. Time to Market: Don’t build anything you don’t absolutely have to.
You’re in the middle of a remote sensing and satellite gold rush. Focus on building the pieces where you provide real innovation and value. To extend the “gold rush” analogy, now is not the time to build a pick-axe factory. While your new business is just getting started in the remote sensing and geospatial space, others have helped pave the way. The architecture, infrastructure, APIs, and even implemented tooling for large-scale geospatial ingest, archive, processing, and discovery largely exist. Don’t rebuild the tooling! Before you start building anything, ask yourself: are you really building a Platform(™), or just a mountain of technical debt you’re going to need to maintain later?
Chances are, there will be a point down the road where you need to build more custom pieces or expand and enhance an existing component of your business. Until then, wait and take this time to learn what those pieces might be, leveraging existing community components in the meantime. Open source resources are out there, and, if you dig into it, you might find that 80% of the software you’re looking for is already available to you.
Don’t view “open source” exclusively through a software lens: open source (or open specification) resources are abundant and relevant when it comes to industry standards as well. During your development phases, learn what the community standards are in your field if you don’t know them already – and embrace them. It’s far easier to engage with a potential customer or partner you’re already interoperable with than to spend time explaining how to use your custom search API, metadata, data formats, access patterns, tasking APIs, and so on.
2. Size of Market: Understand your customer(s) and reach them where they are
This whole space is still sorting itself out. Are you flying your own instruments and selling low-level products to resellers? Are you selling to analytics companies? Are you selling to organizations in totally different domains like farming or insurance? Each of those requires its own vocabulary, has its own favorite data formats and workflows, and so on.
Aligning with existing standards is often the quickest way to be instantly compatible with customers, so use them if they apply: STAC (the SpatioTemporal Asset Catalog), COGs (Cloud-Optimized GeoTIFFs), COPC (Cloud-Optimized Point Clouds), etc.
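To make the interoperability point concrete, here is a minimal sketch of assembling a STAC API item-search payload. The field names (`bbox`, `datetime`, `collections`, `limit`) come from the STAC API spec; the collection id `my-sar-collection` and the helper function name are hypothetical placeholders, not part of any real catalog.

```python
import json

def build_stac_search(bbox, start, end, collections, limit=100):
    """Assemble a STAC API item-search request body.

    bbox: [west, south, east, north] in WGS84 degrees.
    start/end: ISO 8601 timestamps; STAC expresses the range
    as a single '/'-joined interval string.
    """
    return {
        "bbox": bbox,
        "datetime": f"{start}/{end}",
        "collections": collections,
        "limit": limit,
    }

payload = build_stac_search(
    bbox=[-77.1, 38.8, -76.9, 39.0],   # a small box around Washington, DC
    start="2024-01-01T00:00:00Z",
    end="2024-01-31T23:59:59Z",
    collections=["my-sar-collection"],  # hypothetical collection id
)
print(json.dumps(payload, indent=2))
```

Because this shape is standardized, any STAC-compatible client or customer tool can consume it without a custom integration – which is exactly the sales conversation you want to skip.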
Since part of this industry is still figuring itself out, you’ll likely find customers who say “Just send me the raw data!”. When you send them a 50GB SLC SAR scene, you’re going to get a follow-up call. The datasets are large, they can be complicated to use, and they’re often just a piece of a bigger solution. Be prepared to have the right people engage with the right customers, and know how to package and distribute your offerings to those individuals.
Understand how key factors like latency, revisit rate, and AOI size impact any given customer. Spending a bunch of money to chase single-minute latency when your user is going to combine it with something like an 18-day composite might not be the best plan. Conversely, if you have a 3-hour latency and anything longer than 30 minutes is useless to your customer, you’ll need to figure out how to address that gap.
3. Scale of Market: What you’re doing could be of global scale … or more.
The geospatial and remote sensing space operates at a massive scale, for you and your users. NASA’s Earth Observation archive is currently over 50 PB and is on track to hit 200 PB in the next few years. At Element 84 we maintain nearly 30 PB of open data in the AWS Open Data Program and support commercial operators already working at the petabyte scale. AWS can provide the infrastructure, but how are you going to handle your archive? How are your users going to interact with it? How are they going to manage the data and information they’re looking to buy from you? What does the cost model look like for it? When do you store data vs. process or reprocess it on demand?
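The store-vs-reprocess question above is ultimately arithmetic, and it’s worth running early. Here is a back-of-the-envelope sketch comparing the two; every number in it (the per-GB storage rate, the per-job compute cost, the archive size, the request volumes) is an illustrative assumption, not a quote from any provider.

```python
def monthly_cost_stored(archive_gb, storage_per_gb_month=0.023):
    """Cost of keeping a derived product online every month.

    storage_per_gb_month is an assumed object-storage rate in USD.
    """
    return archive_gb * storage_per_gb_month

def monthly_cost_on_demand(requests_per_month, compute_per_job=0.15):
    """Cost of regenerating the product each time it is requested.

    compute_per_job is an assumed per-reprocessing-job cost in USD.
    """
    return requests_per_month * compute_per_job

archive_gb = 10_000  # 10 TB of derived products (assumed)
for requests in (100, 1_000, 10_000):
    stored = monthly_cost_stored(archive_gb)
    on_demand = monthly_cost_on_demand(requests)
    cheaper = "store" if stored < on_demand else "reprocess"
    print(f"{requests:>6} requests/month: "
          f"store=${stored:,.0f} on-demand=${on_demand:,.0f} -> {cheaper}")
```

Under these made-up rates, reprocessing wins at low request volumes and storing wins once demand is high; the crossover point is what your real pricing data needs to pin down.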
Custom one-off processing can be done by a human, but how do your customers consistently monitor a specific area of interest (AOI)? 10 AOIs? 1000 AOIs? Combine that with AIS data, hyperspectral information, RF data, etc. How do you and your customers do this at a planet scale and with a sustainable cost model?
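A minimal sketch of what consistent AOI monitoring looks like in code: every new scene footprint is checked against every subscribed AOI. The AOI and scene names and boxes below are hypothetical, and the brute-force loop is fine for thousands of AOIs; at true planet scale you would swap in a spatial index such as an R-tree rather than scanning every pair.

```python
def bboxes_intersect(a, b):
    """True if two [west, south, east, north] boxes (degrees) overlap."""
    return a[0] <= b[2] and b[0] <= a[2] and a[1] <= b[3] and b[1] <= a[3]

def match_scenes(scenes, aois):
    """Return (scene_id, aoi_id) pairs where a scene covers an AOI."""
    return [
        (scene_id, aoi_id)
        for scene_id, scene_bbox in scenes.items()
        for aoi_id, aoi_bbox in aois.items()
        if bboxes_intersect(scene_bbox, aoi_bbox)
    ]

# Hypothetical subscriptions and one incoming scene footprint.
aois = {
    "port": [-76.6, 39.2, -76.5, 39.3],
    "farm": [-95.5, 41.0, -95.3, 41.2],
}
scenes = {"scene-001": [-77.0, 39.0, -76.0, 40.0]}

print(match_scenes(scenes, aois))  # only the port AOI falls inside scene-001
```

Each matched pair is then a trigger for notification, tasking, or downstream processing – and the cost model question above applies to every one of those triggers.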
While you’re still getting started it’s important to make key distinctions for your business. A few questions that can help tease out risks and opportunities are:
Are you a data multiplier or a data reducer? Do you produce higher level products and derived information or do you expect your customers to? How does your cost model scale with these answers?
Are you providing direct access to your data in place? Are you copying it for users? What kind of authentication and authorization controls do you need in place and would they work for your customers?
What network(s) / regions / classifications / facilities do you need to exist on? How do you get data in and out?
Is processing and analysis something you do for your customers? Are there triggers or events that are critical to a customer (e.g. a cloud free scene over this area or automated change detection identified a region of interest)? Are you a taskable solution where customers can ask you not just for what data you have, but what data they want?
Do you need a partner network to help customers build solutions using your data? Where does your offering fit with respect to solutions using a mixture of open, commercial, and proprietary data?
What comes next?
If you’re just getting started, the best thing to do is to learn from companies that have come before you, and figure out where you fit into the bigger ecosystem. If you’re looking for open source solutions to jump-start your process, check out FilmDrop for geospatial infrastructure and Raster Vision, our machine learning library. If any of the tips we outlined in this post resonate with you, we’re happy to talk! Our team will share what we know – the good and the bad. You can find us anytime here, or give us a shout on social.