Pixxel’s hyperspectral orbital imagery attracts investment from Google

Hyperspectral imagery startup Pixxel has closed $36 million in funding as it prepares to roll out new remote sensing and analytics capabilities to customers.

The L.A. and Bangalore-based startup also scored a new strategic investor: Google, the tech giant that’s as well known for its mapping products as it is for its search engine. While Google led the Series B funding round, this does not mark the start of its relationship with Pixxel, CEO Awais Ahmed said in a recent interview.

“We were already working with them as a client before this,” he explained, with an AI research team from Google employing Pixxel’s hyperspectral data in agricultural applications. Google also rolled out its Earth Engine last year, a powerful tool that gives governments and businesses access to a massive trove of Earth observation data from hundreds of sensors in orbit. Many of Pixxel’s users separately use Earth Engine, Ahmed said, and the ultimate goal is to integrate the startup’s data onto that service.

Hyperspectral imaging uses a spectrometer to identify the spectral signature of objects. Taken from space, this type of imaging unlocks an enormous degree of insight into our planet – from detecting gas leaks to identifying specific types of minerals or plants. Pixxel has been developing this technology since 2019, and it put three demonstration satellites into orbit last year.
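To illustrate the idea of matching a pixel's spectrum against a known signature, here is a minimal sketch of the spectral angle mapper (SAM), a standard technique in hyperspectral analysis – this is not Pixxel's actual pipeline, and the five-band reflectance values below are made up for demonstration:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral angle (radians) between a pixel spectrum and a reference
    signature. Smaller angles mean a closer match, and the measure is
    insensitive to overall brightness (illumination scale)."""
    cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos, -1.0, 1.0))

# Hypothetical 5-band pixel compared against two made-up reference signatures.
pixel = np.array([0.12, 0.18, 0.35, 0.60, 0.55])       # vegetation-like curve
references = {
    "vegetation": np.array([0.10, 0.15, 0.30, 0.65, 0.58]),
    "bare_soil":  np.array([0.25, 0.30, 0.34, 0.38, 0.40]),
}

angles = {name: spectral_angle(pixel, ref) for name, ref in references.items()}
best_match = min(angles, key=angles.get)
print(best_match)  # → vegetation
```

Real systems work with hundreds of narrow bands rather than five, but the principle is the same: each material's reflectance curve acts as a fingerprint that can be matched per pixel across an entire scene.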

The startup has been selling data to a number of customers, including the U.S. National Reconnaissance Office. But “to increase the capacity and to actually reach a point that we can be self-sustaining with revenue,” Ahmed said, Pixxel’s team is now focused on launching its next-gen Firefly constellation. Those satellites will be able to provide 5-meter resolution over most of the Earth, up from 10 meters on the demo satellites. (Even at 10 meters, Ahmed points out, those are the highest-resolution hyperspectral sensors ever operated in space.)

The Fireflies also have a longer lifespan: seven years, up from two. They are heftier too – 50 kilograms versus 15 – likely due to increased on-board propulsion. A trio of Firefly satellites will launch in early 2024 with SpaceX, and Pixxel plans to launch another three shortly after. The company intends to launch 18 additional satellites by 2025.

Pixxel’s other major focus has been developing the Aurora analytics platform, which will allow customers to identify the spectral signature of an object with the click of a button. Various models will be built into the platform, such as a crop species identification model, a cloud removal model, and a gas-leak alert model. Customers can use Aurora to track specific areas over time and generate weekly reports on changes over those periods.

“It’s important for us to not just dump data down to our customers and have them figure it out themselves,” Ahmed said. “There’s very few people in the world with the skill set to actually analyze hyperspectral data so we realized to actually open it up to a lot more customers than would be possible without it, we will build and put the Aurora platform out.”

Crucially, the new capital gives Pixxel enough runway to focus on execution and generating revenue without falling into the “valley of death” that annihilates many startups, Ahmed said. The $36 million will see the company through the manufacturing of the first six Firefly satellites and the first launch next year, as well as the development of the Aurora platform.

Ahmed also revealed that some of the cash is going toward the next generation of its satellites, called Honeybees, which will be even larger and provide even higher resolution. In addition to Google, existing investors Radical Ventures, Lightspeed, Blume Ventures, GrowX, Sparta and Athera also participated in the round. Pixxel has now raised $71 million to date.

Ahmed said he could foresee a future where hyperspectral imagery is as accessible to the average person as optical satellite imagery is today.

“Right now, you go to Google Earth and look at your houses and roads,” he said. But in the future, one may be able to easily access hyperspectral data to “go to a particular area and see how much metal has changed or how much forest has decreased, or be able to hover over something and identify [it].”

“I think that’s the future.”