
A New Way of Designing Global Satellite Missions


Image: satellite view of Earth centered on Italy's landmass, with glowing lights marking the larger cities.

Total Earth continuous satellite coverage. With four satellites.

While using four satellites may sound simple, it is actually very complex. Building on remote-sensing work on global flooding conducted with NCSA’s Blue Waters supercomputer beginning in 2015, Patrick Reed, the Joseph C. Ford Professor of Engineering at Cornell University, and his team discovered the right combination of factors to make a four-satellite constellation not only possible but economically feasible. The publication of the team’s results earlier this year, including in Nature Communications, has had the science community buzzing. Reed believes a large part of the response to the work is its practicality. The work was also recently named one of eight global finalists for the top human-competitive result using genetic and evolutionary computation in 2020 (the Humies).

Reed and his collaborators from The Aerospace Corporation made the forces that ordinarily degrade satellite orbits work in their favor. The solution involves launching the satellites to a higher altitude, which exposes them to more perturbing forces: forces from the sun, the ocean tides, the irregular mass distribution of Earth, and other small non-linear forces that accumulate over time.

Normally, these forces would lead to performance degradation, but Reed and his team discovered that if they accounted for them from the start, they could use them as a source of energy and control. This dramatically extends the life of the mission while dramatically reducing fuel requirements, sometimes by as much as 60 percent, all while maintaining satellite coverage. In the Nature Communications paper, results ranged from 86 to 95 percent whole-Earth coverage.

What this means, says Reed, is that “you get a long life, low cost global mission, which I would say is well-posed in general for space-based Earth observation applications.” Other potential applications include navigation or space-to-space communication. One thing it’s not suited for, says Reed, is providing high-speed satellite internet service: the constellation is too high and has too much latency. The latency is about half a second, “below the human threshold of perception,” he explains, which has value for communications but is too long for high-speed internet service.
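The latency figure above follows directly from geometry: a radio signal travels at the speed of light, so a higher orbit means a longer round trip. The sketch below computes the propagation delay for a hypothetical slant range; the 45,000 km figure is an illustrative high-altitude value, not a number from the mission design itself.

```python
# Rough one-hop signal latency for a satellite link, driven only by
# the speed of light and the slant-range distance to the satellite.
C_KM_PER_S = 299_792.458  # speed of light in vacuum, km/s

def round_trip_latency_s(slant_range_km: float) -> float:
    """Ground -> satellite -> ground propagation delay in seconds."""
    return 2 * slant_range_km / C_KM_PER_S

# At a hypothetical 45,000 km slant range, a single up-and-down hop
# already costs ~0.3 s before any processing or relay delays are added.
print(round(round_trip_latency_s(45_000), 3))
```

Processing, queuing, and multi-hop relays only add to this floor, which is why a high-altitude constellation cannot compete with low-Earth-orbit systems on latency no matter how its ground segment is built.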

Our system doesn’t get perfect results, but it gets very strong results. The distinguishing feature of ours is that it could be implemented in the real world. I think one of the confusing things for people who read about our work, they think it is a specific satellite mission. And it’s not that. It’s a new way of designing globally-focused satellite missions in general.

Patrick Reed, Engineering Professor, Cornell University

THE BACK STORY

The idea of using only four satellites for global coverage isn’t new. In the mid-1980s, John Draim patented a four-satellite, total Earth coverage constellation. He provided a geometric proof that such a satellite constellation was possible, says Reed, “but it wasn’t long before others came behind and said, well, if we actually implemented this, you’d lose up to 60 percent of the coverage performance relatively quickly. You’d have to put a huge amount of fuel on the satellites, and have to do a lot of active control to keep them on mission. So nobody built it.”

The reason the satellites degrade, explains Reed, is that when Draim did the original orbital mechanics and design, it was based on the mathematical proof but “it didn’t get into issues in terms of non-linear forces.”

So Reed and his team decided to flip the problem.

DISCOVERY WAS YEARS IN THE MAKING

Reed’s team received a Petascale Resource Allocation (PRAC) from the National Science Foundation (NSF) in 2015 to use the Blue Waters supercomputer at NCSA. Since they were collaborating with Princeton’s Eric Wood, a civil engineer focused on hydroclimatology, the team concentrated on flooding. They used their PRAC to do extensive global hydrologic modeling, running one of the largest experiments of that kind ever conducted.

“We ended up doing a 10,000 member ensemble of one degree hydrologic simulation for the whole planet, and then looked at what you would need to resolve at specific spatial and temporal data needs,” says Reed. “And then we changed what the Aerospace collaborators would do, and resolved their metrics to focus on the actual hydrologic need. And the short of that was it demonstrated that even given that you had 10 satellites, if you lost some—which at the time was occurring, decommissioning four of the 10—you could actually outperform all 10 by coordinating the planning across global agencies or national agencies. And you could actually outperform it with eight satellites.”

With their Blue Waters PRAC research, Reed and his colleagues were the first to formally link high-resolution astrodynamics design and coordination of space assets with their Earth science impacts within a petascale “many-objective” global optimization framework. Their PRAC results required over 144 million core hours and the systematic processing of approximately two petabytes of model output.
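A “many-objective” framework like the one described above searches for designs that trade off several conflicting metrics at once, keeping only those that are not beaten on every metric simultaneously. The sketch below shows the core Pareto-dominance comparison behind that idea; the objective vectors and their meanings (coverage shortfall, fuel use, revisit time) are hypothetical illustrations, not values from the team’s actual framework.

```python
# Minimal Pareto-dominance check, the core comparison inside
# many-objective search. Each design is a tuple of objective values,
# normalized here so that *lower is better* for every entry.

def dominates(a, b):
    """True if design `a` is at least as good as `b` on every
    objective and strictly better on at least one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(designs):
    """Keep only designs not dominated by any other design."""
    return [d for d in designs
            if not any(dominates(other, d) for other in designs if other is not d)]

# Three toy designs: (coverage shortfall, fuel use, revisit hours).
# The third is worse than the first on all three metrics, so it drops out;
# the first two trade off coverage against fuel and both survive.
designs = [(0.05, 0.8, 2.0), (0.10, 0.4, 2.5), (0.12, 0.9, 3.0)]
print(pareto_front(designs))
```

Evolutionary algorithms of the kind recognized by the Humies award repeat this comparison over thousands of candidate designs per generation, which is what made petascale resources relevant to the search.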

That work was reported in one of their first papers, and as it progressed it became clear that more data and cooperation were needed. Because their PRAC was not renewed, they no longer had access to run simulations on Blue Waters, although they could continue to analyze their original data. The loss of Blue Waters’ petascale speed protracted their research, but the team persevered using allocations on less powerful systems through XSEDE (the Extreme Science and Engineering Discovery Environment).

“One of the things the team wanted to do is end on a high note and try to resolve this,” says Reed. “And so what we ended up doing was something we had put in our NSF-rejected renewal [PRAC] proposal and had always intended. If you think about Draim, in the ’80s, he patented that you need four satellites, and that’s a classic result. Geometric proof. But it wasn’t feasible.”

The typical satellite system design, says Reed, is basically the same regardless of intended purpose. There may be some tweaks to meet certain goals, but the metrics for coverage and performance are usually the same. By taking a different approach, starting from the data needed for the endpoint science and then designing a satellite mission to meet that goal, the team achieved the success they were seeking.

THE FUTURE

Reed says their work lays the foundation for different approaches to Earth science. With a four-satellite global constellation, researchers could go beyond continuous observations “that are limited in their scope to small regional footprints.” Countries could also potentially collaborate and share costs, giving poorer countries that normally wouldn’t be able to do remote Earth sensing access to the data.

Even after coverage is achieved, however, constraints, particularly around big data, will remain.

“You have the potential of generating so much information that you can’t even deal with it,” says Reed. “I mean, you’re talking about continuous coverage of 86 to 95 percent of the planet where whatever you’re observing is returning, and that’s going to be constrained by your communications system, your information system, your ground systems, your sensing systems.”

The team is considering what they want to focus on next and how high-performance computing will play into their research. Reed says they’re more focused on “actually using this style of hero class experiments to have consequential real-world results that change how we view, manage, design our coupled human-natural systems to positively impact people’s lives, things along that line. And so we’re navigating that in terms of what future space architectures could or should look like.”


ABOUT NCSA

The National Center for Supercomputing Applications at the University of Illinois Urbana-Champaign provides supercomputing and advanced digital resources for the nation’s science enterprise. At NCSA, University of Illinois faculty, staff, students and collaborators from around the globe use these resources to address research challenges for the benefit of science and society. NCSA has been advancing many of the world’s industry giants for over 35 years by bringing industry, researchers and students together to solve grand challenges at rapid speed and scale.

Blue Waters is supported by the National Science Foundation through awards OCI-0725070 and ACI-1238993.

XSEDE is supported by National Science Foundation through award ACI-1053575.
