Kai Zou

Kai Zou is the CEO and founder of NetMind, a distributed platform for sharing computing resources that allows users to collaborate, train deep learning models, and develop AI applications on large-scale networks. [1]

Education

Zou obtained his Bachelor's degree in Mathematics and Physics from Tsinghua University, where he served as a research assistant in 2010. He later earned a Master's in Mathematics and Statistics from Georgetown University, where he also worked as a teaching assistant in 2013. [1][2]

Career

After graduating from Georgetown University, Zou worked as a quantitative research specialist in the office of the university's senior vice president for research until March 2014. He then became a senior program analyst and consultant for Decision Information Resources (DIR), a research and data collection company, where he worked until November 2021. During this time, he founded ProtagoLabs, a company focused on researching natural language processing (NLP), logical reasoning, and AGI. In September 2021, Zou founded NetMind, and in 2023 he founded AGI Odyssey, a non-profit committed to advancing artificial intelligence (AI), machine learning, and large language models (LLMs). [1][2][3][4]

Interviews

Background and Netmind

On May 6, 2024, CryptoKoryo interviewed Zou about NetMind and its plans in the AI and Web3 space. At the start of the interview, he shared his background in AI and machine learning: [5]

“I mean, myself and most of my team are actually from AI and Web2 rather than Web3. So, I studied Mathematics and Physics at Tsinghua University in China, and after that, I went to Georgetown University for my Master’s degree in Mathematics and Statistics. I have worked as a data scientist and machine learning researcher since 2013, after graduating from Georgetown. I started by building statistical and regression models, using random forests and neural networks at that time to solve classification and prediction problems. That was actually my very first research work, and by then, the neural networks I used only had three layers, not deep neural networks.”

“But yeah, starting in 2016, I started to learn more about deep learning and deep neural networks. I was attracted by the results from CNNs for image detection; the accuracy was over 99%, which was impressive. And I realized there had to be so much potential for improvement on NLP problems. So, I started my AI company, ProtagoLabs, in 2016. It was more like an AI research studio at the time, inspired by DeepMind and OpenAI. AGI is my passion, so I set up a group for AI research. We actually have a lot of publications every year in top conferences, and from my in-house research work, we started to do AI consulting services to provide customized AI models to U.S. companies.”

He then went on to explain how the team created NetMind: [5]

“From 2021, we decided to become more of a product-driven company. The NetMind Power platform came from a simple idea, because we were training our own language model. By then, I realized, oh, the GPUs are so pricey. Why not use our own cards? I can run an AI model on my desktop with a gaming setup. And then we came up with this idea: why not combine more low-grade or low-VRAM GPU cards together to train a large AI model? So, we started our research on the asynchronous training structure, trying to combine the small cards together in a network. Then we started to design the NetMind Power product to encourage more people to contribute their idle machines, and together we can have a potentially large distributed computing network. So, yeah, I relocated to London in 2021 and set up NetMind.”
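The asynchronous structure Zou describes, where cards of different speeds push updates without waiting for each other, can be illustrated with a toy parameter-server loop. This is a minimal sketch of the general idea only; the `ParameterServer` class, the toy gradient, and the worker delays are all hypothetical and do not represent NetMind's actual implementation.

```python
import threading
import time

# Toy parameter server: workers of different speeds push gradient
# updates asynchronously instead of waiting for each other.
class ParameterServer:
    def __init__(self, dim, lr=0.1):
        self.weights = [0.0] * dim
        self.lr = lr
        self.lock = threading.Lock()

    def apply_gradient(self, grad):
        # Updates arrive whenever a worker finishes, in any order.
        with self.lock:
            self.weights = [w - self.lr * g
                            for w, g in zip(self.weights, grad)]

def worker(server, steps, delay):
    # Each worker simulates a card of a different speed computing a
    # toy gradient that pulls every weight toward 1.0.
    for _ in range(steps):
        grad = [w - 1.0 for w in server.weights]
        server.apply_gradient(grad)
        time.sleep(delay)

server = ParameterServer(dim=4)
threads = [threading.Thread(target=worker, args=(server, 50, d))
           for d in (0.0, 0.001, 0.002)]  # one fast card, two slower ones
for t in threads:
    t.start()
for t in threads:
    t.join()
print(server.weights)  # all weights end up close to 1.0
```

The point of the sketch is that slow cards contribute stale but still useful updates, so heterogeneous hardware can cooperate without a synchronization barrier.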

When asked to share his thoughts on the intersection of Web3 and AI, Zou responded: [5]

“That's a good question. So, one thing is decentralized computing power, and I guess another thing is the full decentralization of AI. For large companies, the language model is like a black box for everyone. Researchers, everyday users, and startups who rely on their API have no control. After one update, the performance could be bad, and we don't know what happened. So, there needs to be a more transparent and decentralized model that can be accessed by everyone, where everyone knows how it works, and everyone could even vote on what the next version is going to be. I think that's another thing that could be the future of Web3 and AI. Also, in the future, once we have an AI agent society and AGI is there, I feel like there needs to be a cryptocurrency for transactions between agents, humans, and AI to facilitate the economy.”

He also explained the tokenomics revamp: [5]

“People can see that I'm not from Web3. Initially, I designed the token for 100 years, which is why I truly believe our project will last for over 100 years. By designing this, I aim to decentralize everything in the long run, making the whole platform open source and accessible to everyone, and building a whole ecosystem community so developers can come and build the platform together. But then, after we launched two or three months ago, we suddenly became a Web3 company and started to learn all the terms and concepts. That was actually the first time I heard about FDV, and I realized it probably doesn't make sense for the design to last 100 years. Of course, I want to make it more sustainable for the miners, but for the community fund, the team, or staking, it probably doesn't need to be 100 years. So, I adjusted the new economic model to deflate everything from 100 years to 10 years. After 10 years, the platform will still have tokens coming in from all the payments or the community fund, so it will still be sustainable, and the other proportions don't need to be 100 years.”
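At its core, the revamp Zou describes is a change of vesting horizon: spreading the same fixed allocation over 10 years instead of 100 multiplies the annual release tenfold. A minimal sketch of that arithmetic, where the allocation figure is purely illustrative and not NetMind's actual tokenomics:

```python
def annual_emission(allocation, years):
    """Evenly spread a fixed token allocation over a vesting horizon."""
    return allocation / years

community_fund = 1_000_000  # illustrative figure, not an actual allocation
print(annual_emission(community_fund, 100))  # 10000.0 tokens per year
print(annual_emission(community_fund, 10))   # 100000.0 tokens per year
```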

At the end of the interview, Zou shared his thoughts on AI and academia: [5]

“I think for the model training part, or for the large language model part, I see more teams or professors wanting to build decentralized large language models. I actually see teams coming out and saying, oh, we want to build this language model owned by everyone and accessible by everyone, where everyone can contribute data into it. And we are actually doing the same thing. The asynchronous training structure is a fundamental algorithm we use to build our large language model. My idea for this is to create an auto-growth model with a self-learning feature. So, once there are 50 GPUs, the model will run on 50 GPUs. When we get more GPUs, it will distribute some of the tasks to the new GPUs and train the model. After that, it can merge together, and the model will grow as the computing network grows. This is one of our research projects, and we see it as a perfect fit for the NetMind Power structure. In the long run, we hope it can grow into a really large language model, as good as GPT-4, and be accessible to everyone.”
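The auto-growth idea Zou sketches, redistributing work as GPUs join and then merging local copies back together, might look roughly like the following. Everything here (`rebalance`, `merge`, plain weight averaging) is a hypothetical illustration of the concept, not NetMind's published algorithm.

```python
def rebalance(tasks, num_gpus):
    """Round-robin task shards across however many GPUs are available."""
    shards = [[] for _ in range(num_gpus)]
    for i, task in enumerate(tasks):
        shards[i % num_gpus].append(task)
    return shards

def merge(local_models):
    """Merge per-GPU model copies by averaging each weight."""
    n = len(local_models)
    return [sum(ws) / n for ws in zip(*local_models)]

tasks = list(range(100))
shards = rebalance(tasks, 50)     # the model runs on 50 GPUs
shards = rebalance(tasks, 60)     # ten more GPUs join: redistribute the work
local = [[1.0, 2.0], [3.0, 4.0]]  # two GPUs' local weight copies
print(merge(local))               # [2.0, 3.0]
```

The design choice being illustrated is elasticity: the shard count is a function of the current pool size rather than a fixed constant, so the network can absorb new contributors mid-training.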


Edited on June 5, 2024.


REFERENCES
