Ben Fielding: Decentralizing Machine Intelligence

It started with a noisy desk. The desk sat in a wooden cubicle in a lab at Northumbria University, in northern England, where an AI researcher began his PhD in 2015. The researcher was Ben Fielding, and he had built a large machine, stuffed with early GPUs, to train AI. The machine was so loud that it annoyed Fielding's lab-mates. Fielding shoved the machine under his desk, but it was so big that he had to awkwardly angle his legs to the side.
Fielding had some big ideas. He researched how AI "swarms" (clusters of many different models) can talk to each other and learn from each other, improving the collective whole. There was just one problem: he was limited to that noisy machine under his desk. And he knew he was outgunned. "Google is doing this research too," Fielding says now. "And they have thousands [of GPUs] in a data center. The things they're doing aren't crazy. I know the techniques … I have a lot of ideas, but I can't run them."
Ben Fielding, CEO of Gensyn, is a speaker at Consensus 2025 in Toronto.
Jeff Wilser is the host of The People's AI: The Decentralized AI Podcast and co-host of the AI Summit at Consensus 2025.
So, a decade ago, it dawned on Fielding: compute constraints would always be an issue. In 2015, he knew that if compute was a bottleneck for academia, it would be an even bigger bottleneck once AI went mainstream.
The solution?
Decentralized AI.
Fielding co-founded Gensyn (with Harry Grieve) in 2020, years before decentralized AI became fashionable. The project was first known for developing decentralized compute (I've spoken with Fielding about it for CoinDesk, and on panel after panel at conferences), but the vision was always something broader: "The Network for Machine Intelligence." They're building solutions up and down the tech stack.
And now, a decade after Fielding's noisy desk annoyed his lab-mates, the first Gensyn tools are in the wild. Gensyn recently released its "RL Swarm" protocol (a descendant of Fielding's PhD work) and has just launched its Testnet, bringing a blockchain into the fold.
In this conversation ahead of the AI Summit at Consensus in Toronto, Fielding gives a primer on AI swarms, explains how blockchains fit into the puzzle, and shares why everyone, not just Big Tech, "should have the right to develop machine learning technologies."
This interview has been condensed and lightly edited for clarity.
Congrats on the Testnet launch. What's the gist of what it is?
Ben Fielding: It adds the first MVP features of blockchain integration to what we've launched so far.
What are the original features, pre-blockchain?
So we launched RL [Reinforcement Learning] Swarm a few weeks ago, which is reinforcement learning post-training, run as a peer-to-peer network.
Here's the easiest way to think about it. When a pre-trained model goes through reasoning post-training, DeepSeek-R1 style, it learns to evaluate its own thinking and recursively improve against the task. It can improve its own answer.
We take that process one step further and say, "It's great that models can evaluate their own thinking to improve. What if they could talk to other models and critique each other's thinking?" If you get a lot of models in a group that can talk to each other, they can start learning how to send information to the other models … with the overall goal of improving the swarm itself.
Gotcha, which explains the name “Swarm.”
Right. It's a method of training that allows many models to combine, continuously, to improve the output of a final meta-model that you can create from those models. But at the same time, every single individual model improves on its own too. So if you join the swarm with a model on a MacBook, participate for an hour and then drop out again, you'll leave with an improved local model that has drawn on the swarm's knowledge, and you'll have improved every other model in the swarm in turn. It's a collaborative training process that any model can join and any model can leave. So that's the RL Swarm.
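The collaborative loop Fielding describes can be sketched in miniature. The code below is a toy illustration of my own, not Gensyn's actual protocol: each "model" is reduced to a single quality score, every round each one shares an answer with the group, and everyone nudges itself toward the best answer seen, so the swarm improves collectively while each member keeps its own local state.

```python
import random

# Toy sketch of a swarm training round (illustrative only; not Gensyn's
# real RL Swarm). Each model answers a task, the answers are shared
# peer-to-peer, and every model updates toward the best answer observed,
# a stand-in for models critiquing and learning from each other.
class ToyModel:
    def __init__(self, name, skill):
        self.name = name
        self.skill = skill  # stand-in for model quality, 0.0 to 1.0

    def answer(self):
        # Higher skill yields answers closer to the ideal value of 1.0.
        return min(1.0, self.skill + random.uniform(-0.05, 0.05))

    def learn_from(self, peer_answers):
        # Take a step toward the best answer seen in the swarm.
        best = max(peer_answers)
        self.skill += 0.2 * (best - self.skill)

def swarm_round(models):
    """Everyone answers, then everyone learns from the group's answers."""
    answers = [m.answer() for m in models]
    for m in models:
        m.learn_from(answers)

random.seed(0)
swarm = [ToyModel(f"model-{i}", random.uniform(0.2, 0.6)) for i in range(4)]
mean_before = sum(m.skill for m in swarm) / len(swarm)
for _ in range(20):
    swarm_round(swarm)
mean_after = sum(m.skill for m in swarm) / len(swarm)
print(f"mean skill: {mean_before:.2f} -> {mean_after:.2f}")
```

A participant that "drops out" after a few rounds simply keeps its improved `skill` value, mirroring the MacBook example above.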
Okay, so that’s what you released a few weeks ago. Now where does the blockchain come in?
So the blockchain gives us some of the lower-level primitives of the system.
Let's pretend someone doesn't understand the phrase "lower-level primitives." What do you mean by that?
Yes, so I mean very close to the resource itself. If you think about the software stack, you have GPUs in a data center. You have drivers on top of the GPUs. You have operating systems, virtual machines. You've got all of this stuff.
So a lower-level primitive sits closest to the bottom foundation of the tech stack. Do I have that right?
Yes, exactly. And RL Swarm is a demonstration of what's possible, really. It's a fairly hacky demo of really interesting large-scale, collaborative machine learning. But what Gensyn has been doing for the past four-plus years, really, is building infrastructure. And we're now at the point where that infrastructure is at the v0.1 beta stage. It's done. It's ready to go. Now we need to show the world what's possible, because it's a big shift in the way people think about machine learning.
It sounds like you're doing much more than decentralized compute, or even infrastructure?
We have three main components sitting under our infrastructure. The first is execution: we have consistent execution libraries. We have our own registry. We have libraries that can be reproduced on any target hardware.
The second piece is communication. So, assuming you can run a model on any compatible device in the world, can you get them to talk to each other? If everyone adopts the same standard, everyone can talk, like TCP/IP on the internet, really. So we build those libraries, and RL Swarm is an example of that communication.
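The "shared standard" idea is easiest to see with a concrete wire format. The schema below is purely hypothetical (the field names and JSON encoding are my assumptions, not Gensyn's protocol), but it shows the TCP/IP analogy: once every peer agrees on one message shape, any device can parse any other device's payloads.

```python
import json

# Hypothetical wire format for swarm messages (an illustration, not a
# real Gensyn schema): a flattened tensor plus routing metadata,
# serialized as JSON so any peer can encode and decode it.
def encode_message(sender_id, round_num, tensor):
    """Serialize a flat list of floats plus metadata to a JSON string."""
    return json.dumps({
        "sender": sender_id,
        "round": round_num,
        "shape": [len(tensor)],
        "values": tensor,
    })

def decode_message(raw):
    """Parse a message and sanity-check the payload length."""
    msg = json.loads(raw)
    assert len(msg["values"]) == msg["shape"][0], "corrupt payload"
    return msg

wire = encode_message("macbook-1", 3, [0.1, 0.2, 0.3])
msg = decode_message(wire)
print(msg["sender"], msg["round"], msg["values"])
```

The point is not JSON specifically; it's that encode and decode agree on one schema, exactly as every internet host agrees on TCP/IP.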
And then, finally, verification.
Ah, and I'm guessing this is where the blockchain comes in …
Imagine a scenario where every device in the world can run execution. They can be linked together. But can they trust each other? If I connect my MacBook to yours, yes, they can perform the same tasks. Yes, they can send tensors back and forth, but does either one know that what it received from the other device was actually computed on that device?
In the current world, you and I could sign a contract saying, yes, we agree to make sure our devices do the right thing. In the machine world, that needs to happen programmatically. So that's the final piece we're building: cryptographic proofs, probabilistic proofs, game-theoretic proofs to make that process fully programmatic.
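A heavily simplified sketch of what "programmatic trust" can look like, assuming a deterministic task and a naive hash commitment (Gensyn's actual proofs, per Fielding, are cryptographic, probabilistic, and game-theoretic rather than plain re-execution): the worker publishes a commitment to its result, and a verifier recomputes the task and checks the commitment instead of taking the worker's word for it.

```python
import hashlib

# Illustrative commitment check, not Gensyn's verification scheme:
# a worker commits to the hash of its result; a verifier re-runs the
# same deterministic task and compares commitments.
def run_task(inputs):
    """A deterministic stand-in for a unit of ML work."""
    return sum(x * x for x in inputs)

def commit(result):
    """Hash the result so it can be published without being alterable."""
    return hashlib.sha256(repr(result).encode()).hexdigest()

# Worker side: compute and publish a commitment.
inputs = [1, 2, 3, 4]
claimed_commitment = commit(run_task(inputs))

# Verifier side: recompute and compare, instead of trusting the worker.
honest = commit(run_task(inputs)) == claimed_commitment
dishonest = commit(run_task([9, 9])) == claimed_commitment
print(honest, dishonest)
```

Full re-execution obviously defeats the purpose of outsourcing the work, which is why the real designs Fielding mentions lean on probabilistic spot-checks and game-theoretic incentives rather than redoing every computation.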
That's where the blockchain comes in. It gives us all the benefits of a blockchain you can imagine: persistent identity, payment, consensus, a ledger, and so on.
In the future you'll have the ability to make payments, but for now, you have a trusted consensus mechanism where we can resolve disputes. So this is kind of an MVP of the future Gensyn infrastructure, and we'll add components as we go.
Give us a teaser of what's coming down the pipeline.
When we reach mainnet, all of the software and infrastructure will run live against the blockchain as a source of trust, payment, consensus, identity, and so on. This is the first step toward that. It adds identity, so when you join a swarm, you can register as a unique participant. Everyone knows who you are without having to check some centralized server or website somewhere.
Now let's go wild and talk further into the future. What does this look like a year from now, two years from now, five years from now? What's your North Star?
Sure. The final vision is to take all the resources that sit beneath machine learning and make them instantly, programmatically accessible to everyone. Machine learning is massively constrained by its core resources. That creates a huge moat for centralized AI companies, but it doesn't have to exist. It can be open-source if we can build the right software. So our view is that Gensyn builds all the low-level infrastructure to make those resources as close to open as possible. People should have the right to develop machine learning technologies.