Welcome to the Alpha Please newsletter.
gm friends, today we are talking about Nillion.
Nillion is a fascinating project that I have been following for quite some time. It's a secure computation network that decentralizes trust for high-value data, in the same way that blockchains decentralized transactions.
It’s another project that enables all sorts of new use cases, particularly in the realm of personalized AI.
I spoke with Tristan Litre, Nillion’s director of crypto, to get the alpha.
Nillion TL;DR:
Nillion is a secure computation network.
Traditional challenges of handling high value data include:
The need for secure storage
The ability to compute on stored data without compromising security
Achieving decentralization of data management
High-value data stored in the Nillion Network can be computed on while staying hidden, unlocking new use cases and verticals. Early Nillion builders from the community are working on things like tooling for private predictive LLMs, and secure storage and compute solutions for healthcare, passwords, and trading data.
What is Nillion and what problems are you looking to solve?
Nillion decentralizes trust for sensitive data. The foundation of blockchain is the decentralization of transactions and payments, which is where Bitcoin originated. We've since added logic on top, but primarily in the context of transactions; most chains still refer to on-chain interactions as transactions. The logic on chain is constrained by the original architecture of blockchains, which is designed to maintain a single, trustless, shared state across the entire network. That’s awesome and it lets us do great things, which we even leverage ourselves. However, we're aiming to focus on a completely different realm of use cases that could be decentralized, which is data.
Early on, we were thinking about all the things you could do with this. And I think for a lot of people, especially non-crypto people, when you explain what Nillion does on the basis of privacy-enhancing technologies, they jump to some of the cool real-world use cases, like healthcare data. It's always visceral to say, well, one day if all the hospitals could share their data with each other in a privacy-preserving way, all the biopharma institutions could be two steps closer to curing diseases. And those things are true. But what has really happened with this shift towards AI, not just for us but for everyone, is that it has accelerated what was already happening with technology in general: an erosion of privacy and security, and increasing amounts of trust placed in centralized big tech institutions and even the government. We're at an inflection point, and things might just start slipping way faster now.
If AI becomes what lots of people are saying it will be and what I believe it will be, then data is in a weird place right now.
Your personal data isn't just used for training; you're inputting huge amounts into all of these systems today. It's concerning because it's unclear who has access and what they'll do with it. And what about when cybersecurity incidents happen? Imagine if your entire message history, every kind of sensitive data, private keys, emails, everything, were fed into these systems. That's the promise of personalized AI, like the movie Her. But to get there, we have to be very careful not to build a dystopia.
The goal of what we're doing is to decentralize the trust for sensitive data. The way we achieve this is by building a network that is distinct from traditional blockchain networks and also differs from what some might consider our competitors, such as AI networks or compute networks like Gensyn, among others.
We're focused on valuable data, which is data that cannot be openly stored on a decentralized network but is still needed for use. Today, you can encrypt data and store it on networks like Arweave, and then use another compute solution to decrypt and utilize it. However, whenever this data is in use, or when it's being transferred back to you, it becomes extremely vulnerable. We aim to address the security gaps that exist for data that is personal or critical to institutions, ensuring that your personal secrets and the proprietary information essential to a business are securely managed without these vulnerabilities.
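To make the "vulnerable while in use" point concrete: one standard building block for computing on data that no single party can see is additive secret sharing, a basic multi-party computation (MPC) technique. This is a toy illustration of the general idea, not Nillion's actual protocol; the node counts and values here are assumptions for the example.

```python
import secrets

P = 2**61 - 1  # a public prime modulus; all arithmetic is done mod P

def share(value, n_nodes):
    """Split a secret into n additive shares that sum back to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_nodes - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

def reconstruct(shares):
    """Recombine shares; any subset smaller than all of them reveals nothing."""
    return sum(shares) % P

# Two users secret-share their salaries across a 3-node cluster.
alice_shares = share(90_000, 3)
bob_shares = share(70_000, 3)

# Each node adds the two shares it holds -- it never sees either salary.
node_sums = [(a + b) % P for a, b in zip(alice_shares, bob_shares)]

# Only the recombined result (the total) is ever revealed.
print(reconstruct(node_sums))  # 160000
```

The data stays split across nodes even while the computation runs, which is the gap that encrypt-then-decrypt-to-compute pipelines leave open.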
I’ve seen you use the term “The world’s first blind computation network”, could you explain that?
We've been using this new tagline, "humanity's first blind computer." What we're trying to evoke with it is that right now, so much of the internet, so much of the modern world, is built on trusting cloud infrastructure, like Google Cloud or AWS, on a "just trust me, bro" assumption.
They're supposed to provide privacy for your app and for your users. And as long as you trust them for hosting servers and moving stuff around, you can provide privacy to your users. You can have user accounts so when we both log into Instagram, I can't see your DMs. You know, all the stuff that makes the internet work. But without that "just trust me, bro" assumption, suddenly a lot of the nice-to-haves, a lot of the functionality that people expect from the internet falls apart.
The power of a blind computer is that you get to do all those things, you get to rely on the network, all the other nice things about being a highly networked society, but you don't trust any individual computer or company. There's no Amazon at the centre of it where, if Papa Bezos gets really interested in what's happening on your server, there's probably nothing that can stop him from getting in there. And so that's the power of the blind computer. Eyes closed, but doing the work that you need it to do.
Could you outline some specific use cases for such a protocol?
The main one is AI. When we talk about personalized AI, we're not just talking about running an LLM on various networks. With personalized AI, you're inputting data that nobody should see, and that's where the need for the blind computer comes into play.
I think healthcare is one of the most interesting spaces, mainly because it's not just about people feeling their data is sensitive. There's also a whole layer of regulation and compliance that makes data sharing really complex. This creates a significant amount of friction, affecting how quickly we can collect data, make progress, and discover new things. It's this friction that impacts what we internally refer to as 'Sleeping Giant' industries. These are sectors that could greatly benefit from decentralized infrastructure but have been unable to fully engage with it due to these barriers.
Besides healthcare, there are other sectors like deep financial infrastructure where privacy is crucial. But I'd argue that privacy is important across all of crypto, really. It's a battle we need to keep fighting at the foundational level.
What are some of the blockchain specific applications that are possible with Nillion?
Today, I'd say one of the most exciting things within our reach involves the use of auctions as a primitive. With privacy features, you can enable all sorts of new auction types that haven't seen much use in crypto, except maybe in mid-pipeline applications. It's an interesting angle to explore and build on.
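One classic example of a privacy-enabled auction type (chosen here as an illustration; the interviewee doesn't name a specific design) is the sealed-bid auction. The sketch below uses a simple hash commit-reveal scheme; note that commit-reveal still exposes every bid at the reveal step, which is exactly the limitation blind computation could remove by picking a winner without ever opening the losing bids.

```python
import hashlib
import secrets

def commit(bid: int, nonce: bytes) -> str:
    """Hash commitment binding a bidder to a bid without revealing it."""
    return hashlib.sha256(nonce + bid.to_bytes(8, "big")).hexdigest()

# --- Bidding phase: only commitments are published ---
bids = {"alice": 120, "bob": 150}  # kept secret by each bidder
nonces = {name: secrets.token_bytes(16) for name in bids}
commitments = {name: commit(b, nonces[name]) for name, b in bids.items()}

# --- Reveal phase: bids are opened and checked against the commitments ---
for name, bid in bids.items():
    assert commit(bid, nonces[name]) == commitments[name], "invalid reveal"

# Highest revealed bid wins.
winner = max(bids, key=bids.get)
print(winner)  # bob
```

The random nonce prevents guessing a bid by hashing candidate values, and the commitment published up front prevents anyone from changing their bid after seeing others'.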
I think there's some really cool stuff happening in the realm of voting. DAOs, on the other hand, seem to be in a bit of an existential crisis at the moment. There's not much energy or optimism in that space, but I think it'll hit a turning point eventually. The kind of infrastructure we're working on offers more options, bridging the gap between these extremes and introducing features that could attract interest in ways we haven't seen before. It's not just about voting; it's about a broader set of tools that can enhance how decentralized organizations operate.
Then there's the realm of DeFi use cases. We've got someone on the team who's developing encrypted order books. This means you don't have to rely on a centralized institution to handle your order flow, your limit orders, and other details. But at the same time, you're not forced to lay all your cards on the table by putting everything on-chain, where a market maker might front-run your order and take your liquidity. There are many parts of trading that you might want to keep private. They're working on the engine for these order books, but I see it as a foundation that others can build on top of and apply it to crypto.
What does your roadmap look like for 2024?
So we're planning to host some hackathons where people can start to play with our technology. The goal for this year is to go full steam ahead and launch this network, making it available for people to interact with. That's why we went quiet for a bit, you know? We were really heads down, focused on this project.
The vision for what this infrastructure can achieve is huge, so we've been working hard to bring this first Beta version to life and get it into people’s hands to see what they can do with it. We're aiming to roll it out, at least in some form, and get builders on board pretty soon. Right now, it's more on the private side, with our business development team and ecosystem guys talking to potential builders non-stop. We've already spoken to over a hundred builders and have more announcements lined up about who's joining us. We're planning to open this up more, still keeping it somewhat private, but really starting to engage more with the community. Then we'll quickly move to making the SDK available, so developers can start experimenting locally, see how it works, and begin creating. From there, we'll invite them to the testnet, leading up to a public testnet and beyond.
How will your token play a role in the ecosystem if you plan to release one?
You can think of the token as a pretty normal part of how it works in a lot of other decentralized systems. It's an important part of the incentive structure and how users access network resources. It's also key to ensuring the network's resilience against DoS attacks and other stuff like that. So yeah, the network needs a token to function.
That's part of the reason we use blockchain technology in our system. We need a reliable way to coordinate payments and reach consensus on various operational aspects—not necessarily user data, but other critical components. Our cluster-based approach requires the whole network to be aware of specific details, like which clusters are servicing a user's request. This level of coordination is essential for the network's functioning.
Are there any projects building on Nillion that you think people should check out?
Everyone should definitely go and check out the ecosystem section on our website. Out of the ones on there, I think Rainfall is really interesting, because it really falls into that “personalized AI” category.
Another one I mentioned, but not by name, is ChooseK. They aren’t specifically a crypto team, but an elite MPC privacy-enhancing technology group who are building out the decentralized order book that I spoke about earlier. Those guys are killers, and I really think this is going to be extremely useful, with a real chance at great product-market fit.
The one that's already the most advanced and is working on a PoC with us is Mailchain. What Mailchain is doing is really impressive. In the crypto messaging stack, there are a few players, but it's still quite early days. The technology is evolving, but the space itself hasn't fully captured the attention or explored the integration use cases to its full potential yet. However, it has a real chance to make a significant impact, especially if integrated with emerging areas like gaming. Messaging is a fascinating primitive for developers to experiment with, and I think Mailchain is doing some pretty cool stuff in that space. I'd recommend people check it out.
How permissionless will the protocol be on launch?
This is something we've thought a lot about. Ultimately, we've decided to embark on what we call a pathway to “permissionlessness”, using a progressively decentralized approach.
We have this concept of horizontally scaling through clusters, which is somewhat like sharding in the blockchain world. This concept is fitting for our infrastructure because we don't require a single shared state across the entire network. Our aim is to leverage the advantages of decentralization, like having multiple nodes, to distribute trust. This means trust doesn't have to be uniform across the entire network all the time. There's a sort of calculus each participant might go through, weighing the crypto-economic incentives, like how much is at stake for these nodes and how much they're backed by others.
Reputation matters a lot though. It's about who is running these nodes. These nodes are out there, in a proof of stake system for example, trying to establish themselves as reliable, as entities that people should trust to get more business. And this applies at the cluster level, too. These clusters are how we scale horizontally. You can have multiple clusters. Let's say a cluster has 10 nodes, which is a good number. You could have many such clusters. So, to be permissionless, it's like saying anyone can spin up a cluster. You and your 10 friends could start a cluster, and then people can connect to it.
The fear we have with going fully permissionless is that it could expose our users to a lot of bad actors pretty quickly. We plan to add nodes carefully, considering their reputation and other factors I mentioned earlier, and gradually move towards a more permissionless system.
Nillion intro resources:
Co-founder and CMO Andrew Yeoh recently spoke to the Harvard blockchain society:
And that’s your alpha.
Not financial or tax advice. This newsletter is strictly educational and is not investment advice or a solicitation to buy or sell any assets or to make any financial decisions. Cryptocurrencies are very risky assets and you can lose all of your money. Do your own research.