AI-powered galaxy hunters add to the global GPU crisis


NASA announced that it will launch the Nancy Grace Roman Space Telescope into orbit in September 2026, eight months ahead of schedule. The new space telescope is expected to provide 20,000 terabytes of data to astronomers over its lifetime.

That will come on top of the 57 gigabytes of stunning images the James Webb Space Telescope, which began operating in 2021, sends back daily, and the Vera C. Rubin Observatory in the mountains of Chile, which begins scanning later this year and is expected to collect 20 terabytes of data each night.

For comparison, the Hubble Space Telescope, once the gold standard, produces only 1 to 2 gigabytes of sensor readings each day. The days when all these readings could be inspected manually are long gone, and like everyone else sitting on a pile of data, astronomers are now turning to graphics processing units to solve their problems.

Brant Robertson, an astrophysicist at the University of California, Santa Cruz, has had a front-row seat to this change in science while supporting or using data from these missions. Robertson has spent the past 15 years working with Nvidia to apply GPUs to problems in understanding space, first through advanced simulations to test theories about supernova explosions, and now developing the tools needed to analyze torrents of data from state-of-the-art observatories.

“There’s been this evolution [from] looking at a small number of objects, to doing CPU-driven analyses on large scales of a dataset, and then doing GPU-accelerated versions of those same analyses,” he told TechCrunch.

Robertson and then-graduate student Ryan Hausen developed a deep learning model called Morpheus that could comb through large datasets and identify galaxies. Early AI analysis of Webb’s data has turned up a surprisingly large number of a certain type of disk galaxy, adding a new wrinkle to theories about the evolution of our universe.

Now Morpheus is changing with the times: Robertson is shifting its architecture from convolutional neural networks to the transformers behind the rise of large language models. That will let the model analyze several times more sky at once, speeding up its work.
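To make the architecture shift concrete, here is a minimal NumPy sketch of a vision-transformer-style forward pass over an image cutout: the image is split into patches, each patch becomes a token, and self-attention relates every patch to every other before a small classification head. The patch size, embedding width, class count, and weight initialization below are all illustrative placeholders, not Morpheus's actual design.

```python
import numpy as np

rng = np.random.default_rng(0)

def patchify(image, patch=16):
    """Split a square image into flattened (patch*patch)-length token vectors."""
    h, w = image.shape
    return (image.reshape(h // patch, patch, w // patch, patch)
                 .transpose(0, 2, 1, 3)
                 .reshape(-1, patch * patch))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(tokens, wq, wk, wv):
    """Single-head scaled dot-product attention over patch tokens."""
    q, k, v = tokens @ wq, tokens @ wk, tokens @ wv
    scores = softmax(q @ k.T / np.sqrt(k.shape[-1]))
    return scores @ v

# Mock 64x64 telescope cutout and randomly initialized (untrained) weights.
image = rng.normal(size=(64, 64))
d = 32                                        # embedding width (illustrative)
w_embed = rng.normal(size=(16 * 16, d)) * 0.02
wq, wk, wv = (rng.normal(size=(d, d)) * 0.02 for _ in range(3))
w_head = rng.normal(size=(d, 4)) * 0.02       # 4 morphology classes (illustrative)

tokens = patchify(image) @ w_embed            # (16 patches, d)
attended = self_attention(tokens, wq, wk, wv) # every patch attends to every patch
logits = attended.mean(axis=0) @ w_head       # pool patches, classify
probs = softmax(logits)                       # class probabilities
```

Because attention is a batch of dense matrix multiplications, it maps cleanly onto GPUs and lets the model ingest much larger fields of view per pass than a sliding convolutional window.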


Robertson is also working on generative AI models trained on space telescope data to improve the quality of observations collected by ground-based telescopes, which are distorted by the Earth’s atmosphere. Despite advances in rocketry, it’s still difficult to put an 8-meter mirror into orbit, so using software to improve Rubin’s observations is the next best thing.
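A generative model trained on space-telescope imagery is well beyond a short sketch, but the underlying deblurring problem can be illustrated with the classical Richardson–Lucy deconvolution algorithm, a standard pre-AI baseline for removing a known blur. This minimal NumPy sketch assumes a known, symmetric Gaussian point-spread function standing in for atmospheric blur; it is not Robertson's method, and the grid size and sigma are illustrative.

```python
import numpy as np

def fft_convolve(a, b):
    # Circular convolution via FFT; assumes the PSF is defined on the
    # same grid as the image with its center at pixel (0, 0).
    return np.real(np.fft.ifft2(np.fft.fft2(a) * np.fft.fft2(b)))

def richardson_lucy(observed, psf, iters=30):
    """Iteratively sharpen an image blurred by a known, symmetric PSF."""
    est = np.full_like(observed, observed.mean())  # flat initial guess
    for _ in range(iters):
        blurred = np.maximum(fft_convolve(est, psf), 1e-12)
        # Symmetric PSF serves as its own mirror kernel in the update.
        est = est * fft_convolve(observed / blurred, psf)
    return est

# Build a wraparound-centered Gaussian PSF (sigma is illustrative).
n, sigma = 32, 2.0
x = np.fft.fftfreq(n) * n
xx, yy = np.meshgrid(x, x)
psf = np.exp(-(xx**2 + yy**2) / (2 * sigma**2))
psf /= psf.sum()

# Simulate a point source (a star) smeared by the "atmosphere", then restore it.
truth = np.zeros((n, n))
truth[16, 16] = 1.0
observed = np.maximum(fft_convolve(truth, psf), 0.0)
restored = richardson_lucy(observed, psf)
```

The catch, and the opening for learned approaches, is that the real atmospheric blur varies across the sky and over time and is never known exactly; a generative model trained on sharp space-based references can learn to estimate and remove it jointly.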

But he’s still feeling the pressure of global demand for GPU access. Robertson used National Science Foundation funding to build a GPU cluster at UC Santa Cruz, but it has become outdated, even as more researchers want to apply computationally intensive techniques to their work. The Trump administration has proposed cutting the NSF’s budget by 50% in its current budget request.

“People want to do AI and machine learning analytics, and GPUs are really the way to do that,” Robertson said. “You have to be entrepreneurial… especially when you’re working on the edge of technology. Universities are very risk-averse because they have limited resources, so you have to go out and show them that, ‘Look, this is where we’re going as a field.'”

