If you’ve ever had a PET scan, you know it’s an ordeal. Scans can help doctors detect cancer and track its spread, but the process itself is a logistical nightmare for patients.
It begins with a 4-6 hour fast before you even head to the hospital. If you live in a rural area and your local hospital doesn’t have a PET scanner, you’re out of luck and may face a long trip. Once you arrive, you’re injected with a radioactive tracer and then have to wait about an hour for it to circulate through your body. You then enter the PET scanner and must lie still for roughly 30 minutes while the images are taken. Afterward, because you remain mildly radioactive, you must keep your distance from elderly people, young children, and pregnant women for up to 12 hours.
Another bottleneck? PET scanners are concentrated in large cities. That’s because the radioactive tracers must be produced in a nearby cyclotron (a type of particle accelerator) and used within hours, which limits access at rural and regional hospitals.
But what if AI could be used to convert CT scans, which are far more accessible and affordable, into PET images? That’s the pitch of RADiCAIT, an Oxford spinout that emerged from stealth this month with $1.7 million in pre-seed funding. The Boston-based startup, a top-20 finalist in Startup Battlefield at TechCrunch Disrupt 2025, has just begun raising a $5 million round to advance its clinical trials.
“What we’re really doing is taking the most constrained, complex, and expensive medical imaging solution in radiology and replacing it with the most accessible, simple, and affordable one: CT,” Sean Walsh, CEO of RADiCAIT, told TechCrunch.
RADiCAIT’s secret sauce is its underlying model: a generative deep neural network developed at the University of Oxford in 2021 by a team led by Regent Lee, the startup’s co-founder and chief medical information officer.

The model learns by comparing paired CT and PET scans, mapping them to each other, and extracting patterns in how the two relate. Sheena Shahande, RADiCAIT’s chief engineer, describes it as connecting “disparate physical phenomena” by translating anatomical structure into physiological function. The model is then directed to pay particular attention to certain features of a scan, such as specific tissue types or abnormalities. This focused learning is repeated across many examples, allowing the model to identify which patterns are clinically important.
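RADiCAIT hasn’t published its architecture or training code, but the description above broadly resembles paired image-to-image translation. The sketch below is a hypothetical illustration of that idea in PyTorch: a toy encoder-decoder maps a CT slice to a PET-like slice, and the loss up-weights regions flagged as clinically important. The CTtoPETNet class, the lesion mask, and the 4x weighting are all invented for illustration, not RADiCAIT’s actual design.

```python
# Hypothetical sketch only; RADiCAIT has not published its model or training code.
# It illustrates paired image-to-image translation (CT -> PET-like) with a loss that
# weights clinically important regions more heavily.
import torch
import torch.nn as nn

class CTtoPETNet(nn.Module):
    """Toy encoder-decoder that maps a 1-channel CT slice to a 1-channel PET-like slice."""
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1), nn.ReLU(),
            nn.ConvTranspose2d(32, 1, 4, stride=2, padding=1),
        )

    def forward(self, ct):
        return self.decoder(self.encoder(ct))

def weighted_l1_loss(pred, target, attention_mask):
    """L1 reconstruction loss that up-weights pixels flagged as clinically important
    (e.g., a lesion mask); the 4x weight is an arbitrary illustrative choice."""
    weights = 1.0 + 3.0 * attention_mask
    return (weights * (pred - target).abs()).mean()

model = CTtoPETNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)

# Dummy batch standing in for co-registered CT/PET pairs plus a lesion mask.
ct = torch.randn(4, 1, 128, 128)
pet = torch.randn(4, 1, 128, 128)
mask = (torch.rand(4, 1, 128, 128) > 0.9).float()

optimizer.zero_grad()
loss = weighted_l1_loss(model(ct), pet, mask)
loss.backward()
optimizer.step()
print(f"training loss: {loss.item():.4f}")
```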
The final image seen by the doctor is produced by combining the outputs of multiple models working together. Shahande compares the approach to Google DeepMind’s AlphaFold, the AI that revolutionized protein structure prediction: both systems learn to translate one type of biological information into another.
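RADiCAIT also hasn’t disclosed how the outputs of those models are combined, but one simple way to fuse several models’ predictions into a single image is to average them. The sketch below is an illustrative assumption, not RADiCAIT’s method; the three stand-in networks just map a one-channel slice to a one-channel slice.

```python
# Hypothetical sketch: fusing several models' outputs by averaging.
# RADiCAIT has not disclosed its actual ensembling method.
import torch
import torch.nn as nn

def ensemble_prediction(models, ct_slice):
    """Average the PET-like outputs of multiple independently trained models."""
    with torch.no_grad():
        outputs = [model(ct_slice) for model in models]
    return torch.stack(outputs).mean(dim=0)

# Three stand-in networks, each mapping a 1-channel slice to a 1-channel slice.
models = [
    nn.Sequential(nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.Conv2d(8, 1, 3, padding=1))
    for _ in range(3)
]
fused = ensemble_prediction(models, torch.randn(1, 1, 128, 128))
print(fused.shape)  # torch.Size([1, 1, 128, 128])
```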
Walsh claims the RADiCAIT team can show mathematically that its synthesized PET images are statistically similar to actual chemical PET scans.
“What our trial showed was that the same quality of decision-making was made when physicians, radiologists, and oncologists received chemical PET or [AI-generated PET],” he said.
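Walsh didn’t specify which statistics that comparison relies on. As a rough illustration only, image-similarity metrics such as SSIM and PSNR are common ways to quantify how closely a generated image matches a reference scan; the random arrays below are stand-ins for a co-registered real PET slice and an AI-generated one.

```python
# Hypothetical illustration: comparing a generated image against a reference with
# standard similarity metrics (not necessarily the statistics RADiCAIT uses).
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

# Stand-ins for a co-registered real PET slice and an AI-generated counterpart.
real_pet = np.random.rand(128, 128).astype(np.float32)
generated_pet = np.clip(real_pet + 0.05 * np.random.randn(128, 128), 0, 1).astype(np.float32)

ssim_score = structural_similarity(real_pet, generated_pet, data_range=1.0)
psnr_score = peak_signal_noise_ratio(real_pet, generated_pet, data_range=1.0)
print(f"SSIM: {ssim_score:.3f}, PSNR: {psnr_score:.1f} dB")
```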
RADiCAIT is not promising to replace PET scans in every setting; certain treatments, such as radioligand therapy to kill cancer cells, will still require them. But for diagnosis, staging, and monitoring, RADiCAIT’s technology could make chemical PET scans unnecessary.

“It’s a very constrained system, so we don’t have enough supply to meet the demand for theranostics,” Walsh said, referring to a medical approach that pairs diagnostic imaging (i.e., PET scans) with targeted therapies to treat disease (i.e., cancer). “So what we’re trying to do is absorb the demand on the diagnostic side so that the PET scanner itself can fill the gap on the therapeutic side.”
RADiCAIT has already begun clinical work focused on lung cancer in collaboration with major health systems such as Mass General Brigham and UCSF Health, and is now moving into FDA clinical trials, a more expensive and complex process that is driving its $5 million seed round. Once the product is approved, the next step will be a commercial pilot to demonstrate its commercial viability. RADiCAIT plans to follow the same sequence (clinical pilot, clinical trial, commercial pilot) for its colorectal cancer and lymphoma use cases.
Shahande said RADiCAIT’s approach of using AI to gain valid insights without the burden of difficult and expensive testing is “broadly applicable.”
“We are looking at expanding across the entire radiology department,” Shahande added. “We hope to see similar innovations that connect disciplines, from materials science to biology, chemistry, and physics, wherever we can learn about nature’s hidden relationships.”
To learn more about RADiCAIT, attend TechCrunch Disrupt in San Francisco from October 27-29.

