
Creating a Transparent, Trustworthy Data Platform for AI with Blockchain ft Unbiased AI CEO Sukesh Tedla

An interview with Unbiased AI CEO Sukesh Tedla about how to create a transparent, trustworthy data platform for AI with blockchain technology.

Transcript

Michael
We’re a couple of minutes past the hour, so let’s get started. Yes, I’m sporting my Label Studio swag today—this is some of the exclusive community swag we’ve started rolling out. There’s even a little opossum on the back. I’ll talk more about that later.

We’ve got a lot to cover today, including a big announcement: the launch of our new Label Studio Champions Program. But first, I’m excited to welcome our guest: Sukesh Kumar Tedla, CEO of Unbiased AI. We’re going to have a great conversation about ethical datasets, blockchain, and how those intersect with MLOps. If you’re curious about crypto, decentralized systems, or how to build more transparent AI, you’re in the right place.

Before we dive in, a quick plug for the community. I’m Michael Ludden, and I run the open source community here at Heartex. If you haven’t already, please join our Label Studio Slack—we’re taking questions live in the #webinars channel.

You can also view and RSVP for upcoming webinars on our Webinars page. We’ve got some great sessions lined up for the new year, including one with Activeloop that you can register for now. Before year’s end, we’ll be previewing new features like video annotation and custom front ends. You can read more on our GitHub roadmap.

All past webinars are available to watch on YouTube. And if you’re watching this live—or later—please subscribe, leave a like, or drop a comment. It helps us and the channel grow.

Now, let’s talk about the Label Studio Champions Program. We’ve been working hard behind the scenes to get this ready. This is the first iteration, and we’ll keep expanding it, but it’s live right now at labelstud.io/community/champions.

You can earn points for activities like joining Slack, subscribing to the newsletter, and contributing to the community. Redeem those points for swag and prizes. There’s a leaderboard, and we’ve seeded it with our early community volunteers. For now, just DM me on Slack with what you’ve done—we’ll track it manually to get things going.

We’ll soon add ways to request new activities and rewards directly. Think of this as your early-access invite to help shape the program.

Lastly, a quick link to Sukesh’s company: unbiased.in. And with that, Sukesh—welcome, and over to you.

Sukesh
Thanks, Michael, and thanks for having me. It’s great to be here and to see the amazing work the Label Studio team and community are doing. Many researchers and developers in AI are already benefiting from these tools, so kudos to you all.

Let me start by sharing a bit about Unbiased. We’re a data marketplace and services platform that uses blockchain technology to enable ethical AI development.

I’m based in Sweden, where Unbiased is headquartered, but our team is decentralized and global. In addition to my role as CEO of Unbiased, I chair the Swedish Blockchain Association and work on other blockchain and crypto initiatives.

Before diving into how Unbiased works, I want to explain why we use blockchain in the first place. Blockchain is often misunderstood, so I like to start with a visual metaphor about how current socioeconomic systems often favor the top 1%. When economic issues arise, it's the majority who bear the burden. Blockchain was born out of a desire to change that.

In 2008, following the global financial crisis, the Bitcoin whitepaper was released by Satoshi Nakamoto. That sparked a movement toward decentralization and open source technology.

So, what is blockchain? It’s a distributed, decentralized network of nodes that all share the same data. Each node holds a copy of the transactions, and any changes must be verified by the entire network—this builds trust and transparency.

Each block of data is connected to the previous one using cryptographic hashes, forming a secure, immutable chain. This makes it extremely difficult to tamper with the data. Blockchain also supports smart contracts, which allow automated actions—like exchanging a dataset or triggering payments—without intermediaries.
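To make the hash-linking concrete, here is a minimal, illustrative sketch in Python (a toy model, not any production blockchain implementation) showing how each block commits to its predecessor's hash, so altering any earlier block invalidates every link after it:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    """Create a block that commits to the previous block's hash."""
    return {"data": data, "prev_hash": prev_hash}

def verify_chain(chain: list) -> bool:
    """Recompute each link; any edit to earlier data breaks verification."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["prev_hash"] != block_hash(prev):
            return False
    return True

# Build a tiny three-block chain of dataset events.
genesis = make_block("genesis", prev_hash="0" * 64)
b1 = make_block("dataset collected", block_hash(genesis))
b2 = make_block("dataset labeled", block_hash(b1))
chain = [genesis, b1, b2]

print(verify_chain(chain))    # True
genesis["data"] = "tampered"  # rewriting history...
print(verify_chain(chain))    # ...breaks the chain: False
```

In a real network, every node re-runs this verification independently, which is where the trust and transparency come from: no single party can rewrite the chain without the rest of the network noticing.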

There are different types of blockchains: public, private, and permissioned. At Unbiased, we use public blockchains to ensure full transparency.

So why did we start Unbiased?

Originally, we set out to combat fake news and misinformation by building AI models that could detect and classify news articles. But one key question kept coming up: why should anyone trust the suggestions our models made?

That question led us to blockchain. We thought, what if we could make the entire model development process transparent and auditable? That’s how Unbiased evolved into a platform for building ethical AI, with three core focus areas:

Access to quality, trustworthy datasets

Ethical concerns like bias, fairness, and privacy

Data provenance and regulatory compliance

Most high-quality datasets today are locked away in cloud storage or considered proprietary. They often go unused after a model is trained. Meanwhile, startups and researchers struggle to access data due to legal and privacy concerns. Our goal is to enable trustworthy, peer-to-peer data exchanges.

On the ethical side, we’ve all seen examples of bias—recruiting algorithms that favor men, face recognition tools that underperform for people of color, and ImageNet biases that carry through to downstream CV models. Without diverse, transparent datasets, these issues persist.

Regulatory frameworks like the EU AI Act and Data Governance Act are emerging to address these problems. They introduce risk-based classifications and requirements for transparency, especially for high-risk AI applications.

Unbiased is designed to help developers and organizations meet those standards.

Our platform has three core components:

Data services: We offer collection and annotation services through crowdsourcing and managed workforces. We actually use Label Studio for the front end—thanks again to the team and community!

Bias and fairness evaluation tools: Built into our services for continuous quality checks.

Blockchain-based AI audits and data marketplace: Every event—data collection, labeling, transfer—is recorded on-chain. That makes the entire pipeline transparent and auditable.
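As an illustration of the kind of continuous quality check a bias-and-fairness tool might run (a simplified sketch, not Unbiased's actual metric), demographic parity compares the positive-outcome rate across groups in a labeled or predicted dataset:

```python
from collections import defaultdict

def demographic_parity_gap(predictions, groups):
    """Largest difference in positive-prediction rate between groups.
    0.0 means every group receives positive outcomes at the same rate."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for pred, group in zip(predictions, groups):
        totals[group] += 1
        positives[group] += pred
    rates = [positives[g] / totals[g] for g in totals]
    return max(rates) - min(rates)

# Toy example: group "a" gets a positive outcome 75% of the time, "b" only 25%.
preds  = [1, 1, 1, 0, 1, 0, 0, 0]
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
print(demographic_parity_gap(preds, groups))  # 0.5
```

A pipeline could compute a check like this at each labeling milestone and record the result on-chain alongside the other events, so the fairness history of a dataset is auditable too.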

Let me show you how it works.

Imagine a company needs a dataset of bird images. They submit a request on our platform. From that point forward, every action—project creation, task assignment, labeling, final dataset delivery—is recorded on the blockchain.

The result is a certificate that captures the full audit history. You can customize the metadata—for example, geographic distribution, age ranges, even religious preference if relevant and voluntarily provided. Annotators can choose which projects to participate in and what data to share, and we have incentive and trust systems in place to prevent spam or manipulation.

The certificate can be shared with customers, partners, or regulators to prove how the data was sourced and processed.

Let’s talk about our data marketplace.

Normally, if Company A has a valuable dataset and Company B wants to train a model, they can’t collaborate because of privacy, IP, or compliance barriers. That limits innovation—especially for smaller players who don’t have massive datasets.

We use OpenMined technology and remote execution to solve this. Instead of sharing the dataset, Company B sends a model training request. The data stays in Company A’s environment. If the request is approved, training runs locally, and the model—not the data—is returned.
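The request/approve/train-locally flow can be sketched in plain Python (a conceptual illustration of the pattern only; the class and method names here are hypothetical and not the OpenMined/PySyft API):

```python
class DataOwner:
    """Company A: holds the data, reviews and executes training requests."""

    def __init__(self, dataset):
        self._dataset = dataset  # never leaves this object
        self.audit_log = []      # in practice, recorded on-chain

    def submit_request(self, train_fn, requester):
        self.audit_log.append(f"request from {requester}")
        if not self._approve(requester):
            self.audit_log.append("request denied")
            return None
        # Training runs locally; only the fitted model is returned.
        model = train_fn(self._dataset)
        self.audit_log.append("model returned")
        return model

    def _approve(self, requester):
        return requester == "company_b"  # placeholder review policy

# Company B sends code, not a data request: here, fit a mean predictor.
def train_mean_model(data):
    mean = sum(data) / len(data)
    return lambda _x: mean

owner = DataOwner(dataset=[2.0, 4.0, 6.0])
model = owner.submit_request(train_mean_model, requester="company_b")
print(model(0))  # 4.0 -- the model, not the data, crosses the boundary
```

The key property is the direction of movement: code travels to the data, the data stays put, and every step lands in an audit log that both parties can verify.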

This allows for secure, auditable AI development without transferring sensitive data. And because all interactions are recorded on blockchain, companies can prove compliance and even monetize data that would otherwise sit idle.

Transparency in AI starts from the ground up. We believe the best way to build trust isn’t just with regulation, but by shifting the mindset of developers from the very first line of code. When transparency, fairness, and responsibility are part of your core process, trust follows naturally.

Thank you again for the opportunity to share what we’re building at Unbiased. I’m happy to take any questions.

Michael
That was awesome—thank you. If anyone has questions, now’s a great time to drop them in Slack.

One takeaway for me: I clearly need to go watch Coded Bias on Netflix. I hadn’t seen that before. Also, I loved seeing the audit certificate—can you go back to that slide?

Can you talk a bit more about how granular this metadata gets? For instance, can someone collect things like religious preference?

Sukesh
Yes. We don’t expose personal user data directly, but the metadata is customizable. If a project requires certain attributes—like ethnicity or religious preference—users can voluntarily provide that information. Participation is optional and incentivized, and our system includes measures to prevent abuse, like point penalties for spammers.

Michael
Got it. So annotators can see what’s being asked and choose to opt in. That makes sense.

One more question—what happens when ethically sourced data leads to conclusions that those in power don’t like? Transparency is great, but what if it clashes with a government or corporation’s interests?

Sukesh
That’s a great question. Our approach is bottom-up—we focus on educating and empowering developers. Even if there are regulations, the behavior of the people building these systems matters most. Transparency should be a shared value, not just a checkbox. And that starts with awareness and responsibility.

Michael
That’s a powerful way to wrap up. Want to throw your final slide back up for folks to scan the QR code?

Thanks again, Sukesh. Really fascinating presentation. And thanks to everyone who joined us today. See you next time!
