
Chainstack X Spaces AMA with PowerPool

This is a transcript of a recorded joint Community Call AMA session with Chainstack and PowerPool discussing the potential synergy between the two companies and answering questions from the communities.

Speakers

Chainstack

Pulkit Sachdeva

Product Marketing

Davide Zambiasi

Developer Relations

PowerPool

Gordon Gekko

Strategy

Vasily Sumanov

Research

Recap

Gordon: Hello! A lot of people are still going to be joining us, so I will just make a few introductory comments. I’m Gordon Gekko, Chief Strategist of PowerPool, and today we’re going to be welcoming Chainstack, an organization with which I think PowerPool may have quite a few interesting potential synergies. I’m looking forward to this conversation to learn more about Chainstack. I’m sure the Chainstack Community is going to be interested in PowerPool, and in particular in the PowerAgent Automation Network, which is a very fast-growing, Keeper node-based automation network.

Our PowerPool community operates a multi-EVM chain service or ‘helper’ automation network, one of a growing number of auxiliary networks that offer services to protocols running on both EVM L1 Ledgers and L2 Layers. In our case, our focus is primarily on automation services, a very fast-growing segment of the chain services market. It’s an area about which a lot of people don’t know much, and we’ll be discussing it in quite some detail.

I’m still waiting for some of the Chainstack people to join us, and when they do, they’ll give you an introduction to Chainstack and its products and services, but I think what you’ll see is that PowerPool is made up of a lot of people running nodes and Keepers, and Chainstack provides platforms to do just that across a wide number of chains. I think this overlap is something that we need to explore, getting into more detail about how the two communities can find synergies, not just on public chains, but potentially also on private chains. One of the many things that makes Chainstack interesting is that they also serve private chains, and I think most of the Community only knows public chains. But the reality is TradFi has most of the off-chain assets and probably always will, and as they come on-chain, many of those asset transactions will take place on their private chains, so we’re going to need to work through the use cases.

Most of us are familiar with, and we’ll spend most of our time talking about the pure public-to-public kind of use cases, but there will be public-to-private, and there will be private-to-private, and that just requires a systematic approach to discussing, okay, what’s the value add in each of these particular cases.

This is an AMA format, so we’re hoping to get a lot of questions coming in from both communities. Ask us anything. If we don’t know the answer, we’ll certainly tell you, but I think that there’s a lot to be gained from the complementarity, if you like, of these two communities. Is there anybody else out there who’s got their speaker access?

Pulkit: Hey, can you guys hear me? This is Pulkit from Chainstack. Thank you. Perfect. I see we’ve also got Davide here with us.

Gordon: I thought that we should give an introduction of Chainstack and its products and services and its community at a sort of high level, and then I will do the same for PowerPool, and then we’ll get into more detail about the PowerAgent Automation Network and what it really does, why it’s valuable, why it’s new, why there isn’t anything like it. You can do the same for Chainstack, and then once we’ve fully understood the respective value propositions, if you like, we can discuss how we see developing synergies between the two communities. Basically, I would suggest chain by chain.

We’re just now in the process of going live first on Gnosis Chain. The reason is it’s a very low-cost environment where we hope to pick up a lot of validators as Keepers, who will then follow us from chain to chain as we expand. We’ve been running for some time on Sepolia, getting ready for Mainnet, but it’s a lot cheaper to build up everybody’s experience and test some extreme edge cases on Gnosis Chain. Gnosis Chain already has a very large validator community, and we’re really hoping to pick up quite a few of those as Keepers. So that’s the reason why we rolled out first on Gnosis Chain before Mainnet.

Pulkit: Sounds like a good agenda. Thank you for sharing that. Just to let everyone know that both Davide and I are on the call right now along with a few other members from Chainstack. So like you said, let’s just get started.

But before I do, I do want to thank you and the entire PowerPool team for having us here. We’re really excited about what this session holds for us. We’ve definitely got a few updates that we’d like to share with both Communities.

Introducing Chainstack

Pulkit: So just to quickly introduce Chainstack, for those who haven’t been to the website or who don’t know what Chainstack is, well, we are the limitless Web3 development stack, which means we have every blockchain, every API, every tool, all the infrastructure offerings that you need to build applications at every scale, in a nutshell. And of course, as we go along with this conversation, we’ll find out all the different facets and offerings that we have on the plate.

To quickly introduce myself in a couple of sentences, I’m Pulkit. I work in product marketing at Chainstack. I’ve been here for a year now, constantly in awe of how this whole web3 space is evolving on a daily basis and everything that Chainstack has been bringing to the table. And also, you know, whatever we’ve done just this year alone.

As I said earlier, we have with us our rock star Davide here, accompanying me today to provide some more in-depth technical perspective on things, so to speak. So I’ll hand him the mic for a second here to introduce himself before I move on to sharing some of the things that we’ve been working on and that we’re excited about. Davide, I believe Davide needs to be set as a Speaker.

Davide: Thank you very much for having us here today. And thank you, Pulkit, for the flattering introduction. Yes, as we mentioned, I am the senior developer advocate here at Chainstack, and I mostly take care of the documentation and the content that we produce and distribute. And I also have a pretty big role in supporting the developers coming to use Chainstack services. We very often get many requests from developers, hey, I’m trying to do this and that, how can we do it? And we are happy to take the time to help everybody do that. So that’s a little bit of what I do.

Pulkit: Great. Well, thank you, Davide. Okay, so just, you know, going by the outline, should we go ahead to the next topic? Okay, so just to keep this conversation going, let me just begin with some of the stuff that we at Chainstack are really excited about, some of the things that we’ve launched in the past few weeks. So just to give you a quick glimpse of what we’ve been up to at Chainstack this year, I think I want to begin by shining some light on what we call our Core Stack, which is essentially our base infrastructure layer offerings, like RPC nodes, API endpoints, what have you, right? Things that make it possible for you as a developer to begin interacting with the blockchain. And by the way, let me preface the rest of these updates by sharing our vision here at Chainstack, which is #Web3forAll, you know, just making Web3 accessible for everyone.

And you’re likely to see me come back to this point repeatedly throughout this conversation, so I hope you can find it in your hearts to forgive me for that repetition, but it’s really critical as you’ll see as we go along. Right. Now, one of the biggest launches for us this year has been an evolution of our elastic nodes infrastructure, or more specifically Global Elastic Nodes, which are these intelligently request-routed, geographically load-balanced, super reliable RPC nodes that anyone can deploy and start using in less than a second. It’s super easy to get started and get going for anyone in Web3. Now when I talk about Global Elastic Nodes, it is in contrast to our traditional Regional Nodes, which one would deploy to have their applications serve user bases concentrated in specific geographical regions, right? And even though it’s only been a quarter since we launched some of these Global Elastic Nodes, we’ve already seen a tremendous response from our customers and the developer community overall.

Within two to three short months, these new Global Nodes now account for more than 12% of all usage volume across all requests being routed to our elastic node infrastructure. This goes to show the confidence and the trust that the developer community has been putting in our offerings and we couldn’t be more grateful. Now, in terms of, you know, these updates on the different blockchains that you guys were also talking about earlier, Global Elastic Nodes have always supported an ever-expanding list of protocols and just this past week, we released support for the Aurora mainnet.

Moving on to the next update, we really want to be quick here in the interest of time. Another big and recent release from our Core Stack has been the launch of our multi-chain faucets, which drip up to 0.5 ETH every 24 hours, helping Web3 developers innovate and develop their apps in a safe and cost-efficient manner. These faucets already support Goerli, Sepolia, and even the newly launched Holesky testnet, in addition to the BNB, zkSync, and Scroll testnets. Now anyone can go to faucet.chainstack.com, get their tokens, and just get to building for Web3. So we are really making it easy for anyone to, you know, start going towards our vision of Web3 for all. And so those were a couple of updates from our Core Stack and finally, I want to share some updates from our Data Stack and I want to specifically start with the Chainstack Subgraphs.

We recently onboarded the Base protocol, adding it to, again, an ever-growing list of chains that Subgraphs support, such as Ethereum, BNB, Optimism, Arbitrum, Avalanche, and many more. Also as an extension of our Subgraphs indexing engine is our DeFi API, which allows you to query the entirety of the DeFi industry data within one simple GraphQL interface. The DeFi API is currently available — which is very exciting — as a part of a private beta and we’re already seeing a lot of traction with developers who are signing up for early access. We have a ton of other things going on under the hood of Chainstack, but in the interest of time, those are just a few quick updates from our side. So I’m going to stop right there and let the conversation take its due course and, you know, get to know more about PowerPool and, you know, all the exciting things that you have in store for us today.

About PowerPool/PowerAgent V2

Gordon: Okay, thank you very much. PowerPool is a fair-launched DAO. We have no VCs and we are a DAO composed mostly or primarily of node operators who operate as Keepers on PowerPool’s PowerAgent Automation Network. We’ve gone live first on Gnosis Chain. We will also soon be live on the Ethereum mainnet. We already have agreements and grants to go live on Neon EVM, which is a bridge to Solana.

In terms of our tech stack, we already have bundling agreements with, for example, DAppNode. So our PowerAgent node/Keeper client is already bundled in DAppNode. So a lot of the validators on, say, Gnosis Chain who are already running DAppNode will find it very easy to join the PowerAgent Network as a Keeper and start earning native tokens as automation fees.

One function of our Community is to rapidly expand the number of PowerAgent Keepers and for these Keepers to start earning as much as possible running very lightweight nodes on top of various validator nodes that they may already be running. So the center of gravity, if you like, of our DAO membership is very focused on people who want to run EVM validator nodes. And that’s one of the reasons why I think that there’s interesting potential for synergy with Chainstack.

Why should anybody use the PowerAgent Automation Network and pay gas and automation fees to our Keepers? Well, the first reason is that they’re already using a lot of automation, but their users don’t know it. A lot of the most popular DeFi protocols have been cheating. In the past, in the absence of autonomous service or ‘helper’ automation networks like PowerAgent, protocols have been using ‘home-rolled’ centralized keeper bots to perform automated tasks. Because, as you know, EVM transactions can’t be initiated either by the L1 Ledger or by the L2 Scaling Layer. EVM contracts cannot execute themselves; execution requires an outside automation agent to initiate it, either chronologically or event-driven, based on on-chain data, off-chain data via oracles, or both.

So early on in DeFi, the cheat was, first of all, expect everyone to do everything as manually as possible. And then when that became unwieldy, developers started building their own centralized, ‘home-rolled’ keeper bots to provide automation. PowerPool has been around automating DeFi vaults and baskets since 2020. We were involved in thinking through this whole process of saying, look, home-rolled keeper bots, they’re a single point of failure. They’re a regulatory attack surface. It doesn’t make sense. What’s needed is an autonomous, fully decentralized, fully configurable automation services network that operates the way you would expect something like that to operate. And that’s PowerAgent version 2. It’s been a long time coming.
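
As a concrete illustration of the pattern being replaced, here is a minimal sketch of a centralized, ‘home-rolled’ keeper bot of the kind described above, written in Python with web3.py (v6+ assumed). The RPC URL, contract address, ABI fragment, and the shouldRebalance()/rebalance() functions are placeholders invented for illustration, not any real protocol’s interface; the point is simply that a single script with a single hot key is doing the triggering.

```python
# Sketch of a centralized "home-rolled" keeper bot: one script, one hot key,
# polling a condition and firing a transaction. Everything here is a placeholder.
import time
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://example-rpc.invalid"))   # placeholder endpoint
account = w3.eth.account.from_key("0x" + "11" * 32)           # one hot key: a single point of failure

# Minimal placeholder ABI: a view condition plus the action it gates.
JOB_ABI = [
    {"name": "shouldRebalance", "type": "function", "inputs": [],
     "outputs": [{"name": "", "type": "bool"}], "stateMutability": "view"},
    {"name": "rebalance", "type": "function", "inputs": [], "outputs": [],
     "stateMutability": "nonpayable"},
]
job = w3.eth.contract(address="0x0000000000000000000000000000000000000001", abi=JOB_ABI)

while True:
    # Event-driven trigger: poll on-chain state, since contracts cannot wake themselves up.
    if job.functions.shouldRebalance().call():
        tx = job.functions.rebalance().build_transaction({
            "from": account.address,
            "nonce": w3.eth.get_transaction_count(account.address),
        })
        signed = account.sign_transaction(tx)
        # Note: older eth-account versions expose this as signed.rawTransaction.
        w3.eth.send_raw_transaction(signed.raw_transaction)
    # Chronological trigger: check again roughly once per block.
    time.sleep(12)
```

If this one process or key goes down, the automation stops, which is exactly the single point of failure the text describes.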

I’d just like to run briefly through the main reasons why PowerAgent version 2 is better than everything before and adds value in a wide variety of use cases. The first thing is it’s totally open-source. We’re a fair-launched DAO with a totally open-source code base. So anybody who wants to can look at what’s going on, what tasks are involved, what the algorithms are. And there’s a lot of research and innovation behind some of these algorithms. And what the security risks are or aren’t. It’s constantly, constantly being audited.

The second thing about PowerAgent is that it’s generalized, whereas home-rolled keeper bots are normally highly specialized. That’s why people don’t reuse other people’s keeper bots. But PowerAgent is generalized. It supports all kinds of event-driven and time-driven or chronological tasks or jobs, ensuring trustworthy execution. PowerAgent Network supports all these things right out of the box, which means that if you’re a developer working on a client protocol, what we call a Job Owner, time to market is dramatically reduced because you can use all this stuff out of the box that’s been used before and audited before by lots of other people.

The next thing about PowerAgent is that it’s highly configurable by task. In other words, each task can be treated as a completely different problem and configured to be dealt with in a completely different way. It’s not one size fits all.

And the next thing about it is it’s permissionless. So anyone can become a Keeper running a node. These nodes are lightweight, and hardware requirements are very low. So it’s globally accessible as a network. But at the same time, the clients, the Job Owners, can put requirements on who is in the signer set, defining which Keepers can actually do tasks for them. And those requirements are sometimes quite demanding. If the implications of a transaction not getting done on time are quite expensive, then the Job Owners can stipulate that only Keepers with big stakes can deal with it. On the other hand, if it doesn’t matter that much whether a task is done a day late, or it’s only done when gas is low, or whatever, that’s fine too; it’s configurable task by task.

The next thing is it’s totally autonomous, so you can trust that PowerAgent is operating totally independently of the protocol. That means it’s not a regulatory attack surface. That means it’s definitely not a single point of failure.

PowerAgent is also very flexible, in the sense that it is configurable by task and it’s cost-effective. The fee levels are largely a function of gas. But you can set, you can define those fee levels job by job. It’s not an inflexible fee structure. It’s very secure in the sense that we have random Keeper selection. No one can predict which Keeper is going to do which task. And this turns out to be quite an important feature, even when you’re dealing with private chains.
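
To make the ‘configurable task by task’ idea more tangible, here is a purely hypothetical per-job configuration sketch in Python. The field names and values are illustrative assumptions, not PowerAgent’s actual job schema; they simply mirror the knobs described above: chronological versus event-driven triggers, minimum Keeper stake, gas-aware execution, and per-job fees.

```python
# Hypothetical per-job configuration, for illustration only (NOT PowerAgent's real schema).
rebalance_job = {
    "target": "0x0000000000000000000000000000000000000001",   # contract to call (placeholder)
    "trigger": {"type": "interval", "seconds": 3600},          # chronological job: run hourly
    "min_keeper_stake_cvp": 10_000,   # modest stake requirement for a low-risk task
    "max_base_fee_gwei": 30,          # skip execution while gas is expensive
    "fee_premium_pct": 15,            # Keeper reward on top of reimbursed gas
}

liquidation_guard_job = {
    "target": "0x0000000000000000000000000000000000000002",   # placeholder
    "trigger": {"type": "resolver",                            # event-driven job:
                "resolver": "0x0000000000000000000000000000000000000003",
                "calldata": "0x"},                             # execute when the resolver says so
    "min_keeper_stake_cvp": 100_000,  # costly-if-late job: demand much bigger stakes
    "max_base_fee_gwei": None,        # execute regardless of gas price
    "fee_premium_pct": 35,
}
```

The point of the sketch is that each job carries its own trigger, its own Keeper requirements, and its own economics, rather than one-size-fits-all settings.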

And finally, it’s trustworthy. All the Keepers have skin in the game. If they fail, it costs. And this is not true of lots of other approaches to automation.

So there is a very long list of reasons why the PowerAgent version 2 Automation Network is an innovation. It offers a completely new set of options to EVM-based developers. And it offers a completely new opportunity for home stakers and small node operators to earn native tokens without being locked into any one chain or any one layer. Simply by being a member of the PowerPool Community, you’re interacting with a lot of people who are just like you, doing the same things you are and having the same challenges that you face. So the PowerAgent Network composed of nodes/Keepers and the members of the DAO Community are sort of the same thing.

So with that overview of the PowerAgent Automation Network value proposition as a background, I think one of the things we should discuss in terms of synergies with Chainstack is upcoming rollouts chain by chain. I mentioned earlier that we are currently going live on Gnosis Chain. It would be interesting to know from Chainstack’s perspective, what sort of services do you offer for Gnosis Chain and how do you see your own services on Gnosis Chain developing?

Vasily: Yeah, Gordon, I think I will also add a little bit on top of your introduction. So PowerAgent really can automate anything. It means that developers can automate any on-chain action and on-chain strategy. They can create really complicated tasks that include creating new tasks, for example. So you can create a task that will create new tasks on demand based on some conditions. You can create a task that will fund itself, for example, covering the gas costs of its operations. So basically, you can do a lot of stuff.

But what PowerPool needs, or where PowerPool can collaborate with tech stack partners like Chainstack, is first of all RPC services. It’s very important. If you have some Gnosis Chain RPC services, it will be very relevant to PowerPool launching on Gnosis Chain. Also, if you maybe have contacts with some dApps that are using Chainstack on Gnosis Chain, or the Graph, or, over time, anything else that needs automation, we will be happy to collaborate with these protocols and products, and we can provide some grants for building automation services on top of the PowerAgent Automation Network. So if, for example, some protocol already has some home-rolled keeper bot automation under the hood and they want this automation to become more decentralized, more reliable, and more robust, we can do that for them. They just need to reach out and we will help them to connect their contracts and automate them using PowerAgent. Grants are available.

Exploring potential synergies between PowerPool and Chainstack

Vasily: So could you share, please, something about Chainstack’s activities on Gnosis Chain and also about Ethereum? PowerPool is now launching on the Gnosis Chain mainnet, but in one or two months, we will also go live on the Ethereum mainnet. And I think the RPC services will be quite important for us and also maybe some other things as well.

Davide: Cool. We offer various services on Gnosis Chain, specifically if you’re interested in the RPC services. Obviously, we have a very large distributed network for all our RPCs. That means that you can deploy nodes in the region that you want with the provider that you want, which gives you a lot of flexibility depending on where you’re serving your requests or where your customers are connecting from. So on the Gnosis-specific side, we offer both Elastic and Dedicated nodes. Elastic nodes are a very wide network of nodes deployed pretty much everywhere in the world, so you can leverage that for both full and archive nodes, and we have the Debug & Trace API. So it is a very flexible offering. And then we pride ourselves on not having any rate limiting and being very flexible. So whatever you guys need to do on Gnosis Chain, you can come to us and we’ll make it happen.
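
For reference, connecting to a Gnosis Chain RPC endpoint and running a quick sanity check takes only a few lines with web3.py (v6+ assumed); the endpoint URL below is a placeholder for whatever node URL you deploy.

```python
# Quick sanity check against a Gnosis Chain RPC endpoint (placeholder URL).
from web3 import Web3

GNOSIS_RPC = "https://your-gnosis-node.example"   # placeholder endpoint URL
w3 = Web3(Web3.HTTPProvider(GNOSIS_RPC))

print("connected:", w3.is_connected())
print("chain id:", w3.eth.chain_id)               # Gnosis Chain mainnet is chain ID 100
print("latest block:", w3.eth.block_number)
print("gas price (wei):", w3.eth.gas_price)
```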

I think the second interesting part is that Chainstack also offers Subgraphs as a service. We have subgraphs on Gnosis Chain, which is always a very powerful tool that you can use to aggregate data if you have analytics or any other kind of situation that requires getting a lot of data from the chain in a reliable and efficient way.

Vasily: Yeah, that’s really good. We will definitely connect after this AMA on the RPC topic because we really need it and our Keepers need it as well. In addition, maybe you have a list of dApps that are using Chainstack? If some of them need automation, you could maybe connect us to them and we will be happy to help them with automation. Because automation is our primary focus. We are working on onboarding a lot of new protocols to PowerAgent and we’re offering grants to developers of protocols, and offering technical support to them.

Our mission now is to make all current and planned centralized automation become decentralized. If protocols are using centralized automation, they have a single point of failure. They pose a real threat to their users, to be honest, because if one day these keeper bots go offline, they will not be able to send all those transactions on time, on the right conditions, or otherwise execute properly, and then the protocol gets stuck. This is a big problem, to be honest. In the past, maybe one or two years ago, people started to use centralized bridges a lot and they thought that this was OK, that this was not a big security risk for users. But after the Ankr situation, when the single private key was hacked and stolen and a lot of users lost their money, it was a million dollars of damage to them. And some other bridges, small ones, were also hacked in the last couple of years or so. Soon people started to realize that this is not a good solution at all, because it ends the same way every time: one day it is hacked, people lose money, and that’s it.

So in terms of automation, there have already been some cases of problems with automation failures, and damage sustained because of these failures. But so far, automation risk is not yet a widespread narrative. But I think that we need to greatly reduce automation risks on-chain before the bull market, when there will be much more TVL, many more users, and much more activity in the system. And then if something breaks because a centralized keeper bot failed, we will have a lot of problems with that. So we want to greatly reduce the automation risk by converting all protocols that use centralized bots to use decentralized automation networks like PowerAgent, and we are providing grants and full support.

So I’m just curious to ask, how many protocols are using Chainstack? I mean approximately, both on Gnosis Chain at this moment and on Ethereum mainnet?

Pulkit: Yeah, so I think I’ll just quickly take that question. If I understand correctly, what you want to know is if there are any automation opportunities on the Gnosis Chain network that you can create value for, right?

Yeah, so I think the way I would respond to that is we can certainly explore opportunities there. This is probably a conversation or a discussion that we can take offline, and we can create some sort of a synergy between our two teams. And we can definitely look at the details and how we can get this thing done together. Does that sound okay?

Vasily: Yeah, because our forecast is that in three years, around 40% of transactions will be handled by decentralized automation networks like PowerAgent, and some other auxiliary service networks as well, like Chainlink and Gelato.

I think you know Gnosis Pay, right? So Gnosis Pay is a fully automated cryptocurrency card running on Gnosis Chain. All of the transfers are handled by account abstraction and the Keeper network there. And we think that this is the first step into the really big narrative that not many people understand right now.

So, for example, two years ago, the Curve Wars narrative was drawing all the attention of the DeFi community. But one year before the Curve Wars, nobody understood it. Maybe only a very narrow group of people understood that all this ve-economics and these token aggregation services would get real power, real adoption, and billions of dollars in valuation and usage. So regarding automation, I think this is the underappreciated narrative now. I just want to say that the lack of attention, and maybe the lack of understanding, from the majority of the crypto community is still noteworthy.

Currently, the dominant paradigm is that users all have their own wallets and they have their own self-custody rights, right? And they sign all the transactions themselves. So they manage their assets based purely on their own actions and permission and purely on their own signing of transactions.

But in the future, it will be different. In the future, we believe, people will outsource a lot of transactions. I mean, even individual users will outsource a lot of transactions to automation services keeper networks. Just imagine you have an account abstraction account that is signed by you or by a Keeper network, and you provide some permissions to this Keeper Network. Not for all transactions, not for withdrawing all your ETH to some other wallet, of course, we’re not talking about that. But for interacting with sub-protocols, like making limit orders, making some stop losses, or making a lot of actions on-chain that previously were done manually. Now it will be fully automated and it will be based on what you want, as a user. So, for example, if ETH is above 2K, create a limit order on Uniswap, on Balancer, or on some other DEX to sell your ETH at a certain price. And if the price is below some value, buy some ETH.

But trading is only one of the use cases, one of the most adopted use cases. You can also put some money into lending protocols. You can protect your own account from liquidation. For example, if you borrow funds to farm somewhere, you can easily set up a strategy that will un-stake the stables that you took as debt and close the debt in case your collateral goes down, like in the case of a market crash. So you have a lot of options here. We think that in the future, people will be really OK with the fact that some of the transactions can be signed by someone else on their behalf and according strictly to the conditions and to the logic that they approved previously.

Pulkit: Yeah, I think the way you explained it, I think it’s really comprehensive as far as I understood it. I think the future that you’re describing certainly is a far more convenient future. I think it is a future that enables greater mass adoption of blockchain technology in general. And I think it’s definitely something that we can see as a desirable outcome of what you guys are working on.

Gordon: I think automation is just the logical consequence of the fact that for many, many tasks, the data required is all on-chain or available via oracles, and the logic can depend on off-chain computation. So rather than having to do everything yourself, you lay out the logic of when, how, and under what circumstances you want something done, and then you go to sleep, you touch grass. I mean, it simply doesn’t require as much effort and attention if you move toward automation.

Plus, there are all the other advantages from the developer’s perspective of automating, the regulatory perspective, and the scalability perspective. One of the reasons that we are trying to keep our hardware light and grow our number of PowerAgent Keeper nodes to very large numbers across large numbers of EVM chains is that this constitutes scalability for automation.

Vasily: Yep. I think that PowerPool and Chainstack can build quite a big collaboration on all of this, because Chainstack provides a lot of infrastructure tools, right? Not only the RPCs themselves, but also the other node offerings, the subgraphs, etc. So it all can be used for building automation services. So we are really very open to collaborating with Chainstack and building on the things that you’re doing. Because I think that in Web3, every project should stick to their own business and not try to do everything, right? That’s how the economy works in general. You focus on the most important topic for your protocol and you outsource all other professional services. So this is why I’m really bullish on the Chainstack and PowerPool collaboration.

Pulkit: Well, that certainly sounds like a plan and a synergy in the making. Yeah, I agree with you there. This is something that definitely requires a bit more exploration. Of course, we have our own focus, which is fast, low-latency infrastructure offerings for the Web3 community. And of course, you guys come in with your automation vision of the future. Yeah, I’m sure there’s a ton of synergy that can be created. So yeah, we’re all open doors to that.

Gordon: Well, I think the first step is to focus on what we can do together to rapidly roll out everything we can offer on Gnosis Chain. Following that on our schedule is, of course, Ethereum mainnet. We’ll bring our Sepolia testnet to an end and roll out on mainnet. I also mentioned Neon EVM. I don’t know if they are on your radar at all.

Pulkit: Well, not at the moment, but it is very interesting that you bring it up. This is certainly one of the things that we do very proactively: we look out for new protocols to onboard every month, and there are actually quite a few of them that we onboard quite regularly. Neon EVM has not been on our radar so far, but definitely, as I said, a very interesting choice that you bring to the conversation. So I think what I would love to learn about is what are your perspectives on this and anything that you want to share with us about Neon EVM.

Gordon: Well, Neon EVM, it’s fairly straightforward what they want to do. I mean, they want to take EVM-compatible code and put it on Solana to avoid the gas prices and surge pricing of Ethereum in favor of Solana’s very fast and reliable fee structure. The problem with Ethereum is not only high fees, it’s unpredictable fees. And if you’re automating something and you’re paying fees to PowerAgent, to our Keepers/node runners, you’d like to know what your total cost is going to be. And rightly or wrongly, that’s Solana. And that’s the logic, if you like, for why Neon EVM.

Davide: Yeah, I was just looking at this Neon EVM and it is a very interesting combination. Definitely something that we should explore. Obviously, the EVM is the widest and most popular, you know, runtime on the blockchain. So it’s just kind of an obvious combination. Solana has been performing very well with the low gas fees. If you add the simplicity of the EVM, it sounds like a really good deal to me.

I wanted to ask a quick question for the PowerPool guys. I’m pretty interested in the claims that you guys make regarding off-chain computation. Can you give me a few examples of how you see this segment of the services market, some use cases, or how you would use automation with the off-chain computation?

Vasily: Yep, I think I will provide a couple of them. So just imagine that you need to make a stablecoin funds allocation strategy on-chain. So you select 35 lending protocols or 35 places where you can deposit your USDT and get some yield. And you want to make your capital distribution and redistribution to these 35 options in an efficient way. Assume, for example, you have 10 million dollars and you want to distribute it. You’re building an optimized yield-generating product for that. So to make this happen, you need to solve a 35-by-35 system of linear equations. It’s quite a big matrix and you cannot solve it on-chain. You’ll spend tons of gas; it will be very expensive. It just makes no sense to solve it on-chain.

But what you can do with off-chain computations is take all the on-chain data, I mean the amount of the rewards in every place where you want to deposit your capital, and calculate the optimized allocation of these funds to these 35 places. So it works easily. You find out the most profitable place and you start to allocate money there. I mean, I’m talking about the calculations right now, not about the real allocations. So you start to calculate: okay, if I allocate to the most profitable place, this most profitable place will soon come down to the same profitability as the second on the list, etc. As I said before, it requires solving a 35-by-35 system of linear equations. So what you can do is solve it off-chain, make a ZK proof that it’s calculated correctly, and distribute your funds. You can avoid on-chain calculations for this huge mathematical task, calculate it off-chain, and just use the results. And allocate the capital based on these results to these 35 places and make it efficient, not spending gas, not spending anything on-chain. So this is the first simple example.
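
As a rough sketch of the kind of computation being described, the snippet below solves a simplified version of the allocation problem off-chain with numpy. It assumes each venue’s rate falls linearly with the capital deposited there, which turns ‘equalize marginal rates subject to a total-capital constraint’ into a linear system (36 unknowns for 35 venues); all numbers are made up for illustration, and a production allocator would add constraints such as non-negative allocations.

```python
# Simplified off-chain allocation solver: assume venue i pays rate_i = a_i - b_i * x_i,
# where x_i is the capital allocated there. Equal marginal rates plus a total-capital
# constraint give an (n+1) x (n+1) linear system, solved off-chain in milliseconds.
import numpy as np

n = 35                                    # number of lending venues
rng = np.random.default_rng(0)
a = rng.uniform(0.02, 0.08, n)            # base APRs (illustrative)
b = rng.uniform(1e-7, 5e-7, n)            # rate decay per unit of capital (illustrative)
total = 10_000_000                        # 10M USDT to allocate

# Unknowns: x_1..x_n (allocations) and lam (the common marginal rate).
# Equations: a_i - b_i * x_i = lam  (i = 1..n),  and  sum(x_i) = total.
A = np.zeros((n + 1, n + 1))
rhs = np.zeros(n + 1)
A[:n, :n] = -np.diag(b)                   # -b_i * x_i ...
A[:n, n] = -1.0                           # ... - lam = -a_i
rhs[:n] = -a
A[n, :n] = 1.0                            # sum of allocations equals total capital
rhs[n] = total

solution = np.linalg.solve(A, rhs)
allocations, marginal_rate = solution[:n], solution[n]
print("allocations:", allocations.round(2))
print("common marginal rate:", round(float(marginal_rate), 4))
```

Only the resulting allocation (plus, as described above, a proof that it was computed correctly) needs to touch the chain.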

Another example would be if you need to calculate a complicated portfolio that has a lot of on-chain data that needs to be auto-harvested and processed in some way. Let’s say you have funds in 50 protocols, you need to calculate the whole portfolio value at this moment and in the past. Say you are an asset manager and you need to provide some performance statistics to the client. You can also do this off-chain, calculate it and provide just the results, make some on-chain proof, and that’s it. So any computations that are too heavy to be solved on-chain, you can outsource off-chain.

And it might seem like this has not so many use cases, but that’s not true. There are a lot of use cases for off-chain computations, including, for example, computational portfolio tracking across different chains. Doing that purely on-chain, or multi-chain, which everyone knows is the future, is quite costly.

Davide: That’s really cool. I kind of agree with this part. Everything that’s going on on-chain is very difficult to manage. And it’s definitely not as user-friendly. So we’ll have to add some more ways to get people in.

Vasily: You can even calculate the fund allocations block-by-block, every block off-chain. Because it’s not costing you any money, not costing you any gas. And the linear equation system can be easily solved off-chain; you don’t need a lot of computational time for that. It’s not something that requires a lot of resources to compute, but on-chain it would still be quite expensive. So you can make the calculation every block and reallocate your resources if the optimal allocation changes over time. For example, you could also choose to reallocate your resources every 12 hours or every 24 hours. It depends on the amount of gas you need to spend for this reallocation, because it mostly consists of ERC20 token transfers, and ERC20 transfers are quite expensive. If it’s 35 places and you need to reallocate funds between them, it’s 35 or sometimes fewer ERC20 transfers, which is also quite expensive. Manual DeFi is basically dead, in favor of automated asset manager UIs.

Gordon: We’re starting to get a few questions coming in. One of them is a question for Chainstack. It says:

Q1. How does Chainstack ensure enterprise-grade infrastructure reliability and support for users of Chainstack subgraphs?

Pulkit: That’s a great question. Dave, is it okay if I just begin answering the question and then maybe if it requires, you can take a technical deep dive into it later?

Right. So as I said, I’m not in the best position to go into great technical detail, Dave is, but I’d love to just try and summarize it briefly. I think one of the ways we achieve this enterprise-grade reliability with Chainstack Subgraphs is that we connect our graph indexers with a Global Elastic Node endpoint to ensure that if one node starts lagging or facing performance issues, then all requests are automatically routed to the next best-performing node.

We also have proactive monitoring and alerting on each of the graph indexers to catch every event: when the graph node starts to lose the chain-head of the blockchain node, when the VM’s CPU or RAM usage is nearing its maximum, and so on. There are a lot of metrics that we keep an eye on. And of course, when I say we, what I essentially mean is we have a dedicated 24/7 infrastructure team that keeps an eye on all parts of the graph and Chainstack’s infrastructure. And that is something that I’m really proud of. Of course, we are all proud of our service at Chainstack. So that was the brief summary of it.
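
One client-side way to watch for the kind of chain-head lag mentioned here is to compare a subgraph’s indexed block, exposed through the standard `_meta` field that graph-node generates for every subgraph, against the latest block reported by an RPC node. The sketch below assumes Python with the requests and web3.py libraries; both URLs are placeholders, not real Chainstack endpoints.

```python
# Compare a subgraph's indexed block (standard `_meta` field) with the RPC chain head.
import requests
from web3 import Web3

SUBGRAPH_URL = "https://your-subgraph.example/query"   # placeholder subgraph query endpoint
RPC_URL = "https://your-node.example"                  # placeholder RPC endpoint

meta_query = "{ _meta { block { number } hasIndexingErrors } }"
resp = requests.post(SUBGRAPH_URL, json={"query": meta_query}, timeout=10)
meta = resp.json()["data"]["_meta"]

chain_head = Web3(Web3.HTTPProvider(RPC_URL)).eth.block_number
lag = chain_head - meta["block"]["number"]

print(f"indexed block: {meta['block']['number']}, chain head: {chain_head}, lag: {lag} blocks")
print("indexing errors:", meta["hasIndexingErrors"])
```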

Is there anything that you would like to add here, Davide? Or did I kind of get it in a nutshell?

Davide: No, I think you gave a pretty good overview here. Obviously, we’re focusing on reliability so that the developers can come in, spend two minutes to get their endpoints, and then never talk to us again. At least on the reliability side, you know. And then, as I mentioned earlier, if you have some special requests, we’re for sure always open to that.

Pulkit: Yeah, it’s a bit of an internal joke, isn’t it? Just set it and forget it. I think that’s what we’re looking for.

Q2. How does Chainstack contribute to the growth and adoption of Web3? What notable partnerships or projects are leveraging your services?

Gordon: I guess, in other words, who are the stars of your client list?

Pulkit: That’s a really good question. I think it’s very valuable and pertinent because everything that we do contributes towards the growth and adoption of Web3 technologies. Even though I believe if I start answering this question, the response might get a little longish, let me again try and summarize it. Instead of going into how we contribute to growth and adoption, I think first I’ll tackle the question of which notable partnerships or projects are leveraging our services.

In terms of notable partnerships, I’d say we work with many of these really popular logos, such as the likes of Trust Wallet, Chainlink, CertiK, 1inch, and so many more, empowering their operations.

To give you an example, with 1inch specifically, we were able to collaborate very closely in creating a seamless DeFi experience for their users worldwide, whether it was migrating 1inch’s data accessibility and operations to Chainstack Subgraphs after the sunset of The Graph’s hosted service, or creating and customizing solutions tailored to the needs of 1inch and providing 24/7 support. And while I’m talking about which partnerships or projects are leveraging our services, just to give you some numbers along with this example of 1inch, they currently run over 28 subgraphs on our Dedicated Indexer, powering over $286 billion in transaction volume, all the while experiencing close to a 6x node performance uplift ever since they migrated from The Graph to Chainstack Subgraphs. So that was one example of a partnership or a project that has used our services quite remarkably in the past few months.

Now, if I go back to how Chainstack contributes to the growth and adoption of Web3 technologies, I think I’ll have to harken back to the whole #Web3forAll narrative: just being able to provide high-quality, fast, low-latency, reliable, secure infrastructure offerings. We can talk separately about how these infrastructure offerings are fast, low-latency, and high-quality, but ultimately what it translates to in terms of business outcomes is better user experiences, better customer experiences on the business side of things, and just better business outcomes overall.

What we’ve also done as a part of contributing to the growth and adoption of Web3 is we’ve designed essentially the simplest pricing in all of Web3. I know no one really likes to talk about pricing and packaging, but I think it’s one of the most critical points, especially in today’s bear market where everyone is striving to be responsible and really optimize their spending. So we’ve been able to design the simplest pricing in all of Web3 so developers can focus on building and not worry about hidden charges or unpredictable billing and expenses. And of course, if someone wants to double-click on how we’ve designed the simplest pricing in all of Web3, I’m more than happy to go into much more detail.

But I think right now I’ll move on to the next point. I think what I have off the top of my head is we also offer grants. We offer a substantial amount of grants as rewards in hackathons. Now I know it’s not the most groundbreaking thing, but it is always very important and helpful for developers to test their hypotheses and find product market fit. And then, as I mentioned earlier, we currently have one of the most comprehensive faucets out there today, supporting six testnets and dripping up to 0.5 ETH every 24 hours so developers can build apps safely and in a cost-efficient manner.

In addition, we are very proactive when it comes to creating tools for the developer community, such as the official Chainstack ChatGPT integration and the EVM Swiss Army Knife toolbox, with all the essential tools that a developer needs to speed up their process packed into one place. By the way, both of these, the official Chainstack ChatGPT integration and the EVM Swiss Army Knife toolbox, are to the credit of the one and only Mr. Davide on this call here. He’s the creator of these tools. And of course, before I give the mic to Dave to talk about these tools, there is, of course, the constant promise to keep building products and features, which are demanded by developers to build solutions for the users.

For example, we talked about making data indexing super accessible with things like Chainstack Subgraphs. We’re also extending those to our DeFi API, along with continuously adding and onboarding more and more blockchain protocols onto our infrastructure. Davide, do you want to take a minute to talk about these two tools that I mentioned?

Davide: Yeah, sure. The ChatGPT plugin has been something that we launched a couple of months ago. And thanks, Pulkit, for the flattery. But I have to admit, I didn’t work on those 100% by myself. It was a collaboration with one of our former developers who left not so long ago. But yeah, so the ChatGPT plugin is really just a way to tie Chainstack infrastructure and blockchain into AI. We strongly feel that these generative AI tools are going to be very important overall on the way forward. So that’s partially why we launched the GPT plugin. It’s an interesting way to play with the infrastructure and AI in general. You can find it in the OpenAI store. It’s verified. You just type Chainstack in the store, and then you find it. ChatGPT is very smart in understanding what the user wants, and it’s very good at tying it into what the tool can do. So you can just ask, hey, what can the ChatGPT plugin do? And it can explain to you basically what you have to do. You can pull various on-chain data. You can ask, hey, what’s the balance of this address for this chain?

I believe right now on the plugin, we support a few different chains out of the box. I believe Gnosis Chain is already one of those. So you can actually get data from Gnosis Chain directly from ChatGPT. Obviously, it has some limitations. ChatGPT itself has some limitations on the length of the responses. So if you ask for stuff like, hey, what are the transaction details for every transaction in this block, then it’s probably going to give you an error because it’s just too much stuff. So that was just more of a fun tool to throw out there and see how it’s accepted by the community.

The EVM Swiss Army Knife, as we call it, is an interesting tool. It’s a web app where we aggregate a bunch of different tools. For example, we have all the various conversions. You just have calldata generation or event string generation. We’ve also added a tool where you can just enter an address, like a contract address, and it’s going to pull the ABI of the contract if it’s verified. Right now it’s all on Ethereum because it’s using Etherscan. So those are all efforts to basically allow developers to have an easier path so you don’t have to go to 10 different places to do the things that you need to do.
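
For a flavor of the conversions such a toolbox bundles, the same function-selector and event-topic derivations can be reproduced with web3.py in a few lines (v6+ assumed):

```python
# Deriving a function selector and an event topic from their canonical signatures.
from web3 import Web3

# First 4 bytes of keccak256 of the canonical signature = the function selector
# (for "transfer(address,uint256)" this is the well-known a9059cbb).
selector = Web3.keccak(text="transfer(address,uint256)")[:4]

# keccak256 of the event signature = topic0 used when filtering logs
# (for the ERC-20 Transfer event this starts with ddf252ad...).
topic0 = Web3.keccak(text="Transfer(address,address,uint256)")

print("transfer() selector:", selector.hex())
print("Transfer event topic0:", topic0.hex())
```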

Gordon: Okay, thanks very much. Vasily, do you have any other things you wanted to ask about Chainstack?

Vasily: I think I got everything and I think it’s a good opportunity for collaboration. We will talk about immediate topics, such as decentralized RPCs that we use right now and some other things.

Q3. Does Chainstack provide access to BNB full nodes?

Gordon: Well, great! Another EVM chain that’s on our roadmap is BNB Chain. We at PowerPool have in the past launched automated multi-token baskets on BNB Chain. The question I wanted to ask Chainstack is: because BNB Chain validator nodes are very heavy, does anyone use Chainstack for BNB validators?

Davide: Yeah, actually BNB is our most successful chain overall. I believe most of the transactions and most of the traffic and requests currently come on BNB. Definitely. So we definitely see that one as one of the good ones.

Gordon: Okay, because we really aren’t sure how we will recruit lots of lightweight Keeper nodes on BNB, because we don’t even know how many node runners there are, and whether any of them can or would just run lightweight hardware. I’d be interested in your ideas. I mean, Gnosis Chain is easy. There are tons of potential node runners out there. They’re all running lightweight hardware. They’re mostly integrated with DAppNode. That’s one of the reasons we went to Gnosis Chain first. But, for BNB Chain, we’re not 100% clear what the PowerAgent node/Keeper recruitment strategy is gonna be.

Davide: Yeah, the BNB Chain nodes are definitely heavyweight. We run full archive nodes, and the archive nodes are huge; they require a lot of hardware and a lot of hard drive space. We run Erigon, the Erigon client, which makes it a little bit better. But it’s still, I believe, quite heavy. Let me check the latest numbers, but I believe it’s something like, for a full archive node, you need like 15 terabytes of hard drive or something. Pretty crazy.

Gordon: So how can we find people that just wanna run a little lightweight PowerAgent Keeper node on BNB Chain?

Davide: That is an interesting question. Let me see. I can’t find the numbers right now, but it was something like that. Yeah, so for that, I guess, yeah, I don’t know. I don’t have a good answer to that. That’s a good one.

Gordon: Well, I guess keep your eyes open and think about it for us, because you’re obviously talking to a lot of heavyweight BNB nodes. Obviously, they can add us without a thought, but we like to be highly, highly decentralized. We’d like to have many more Keepers than they have validators on BNB.

Davide: Yeah, that is definitely the goal that you wanna have. Decentralization is definitely what you wanna have.

Gordon: Okay. Well, then just in general, I think that adding our Keeper nodes into your packages and just making it really easy for people to try joining our automation network is a win-win. As I said before, we’re already integrated into DAppNode, but you have a different sort of client base from DAppNode, and we need as many Keepers as we can get on every EVM chain we go to.

Pulkit: Yeah, that certainly sounds like an opportunity. So as I said before, let’s get the two teams talking, and let’s see what the avenue has in store for us.

PowerAgent’s potential launch on private chains via Chainstack

Gordon: Okay, and then with that as an introduction, I’d like to talk a little while about private chains. I know that you run Corda nodes, for example. I’m not sure about Quorum, but Corda, you definitely run. And what’s interesting to us, we’re permissionless. Anyone who wants to can run a PowerAgent node. And in theory, anybody who’s running a node on a private chain could also choose, if they like, to run a PowerAgent node.

Private chain tasks will not all be manual; in fact, most tasks by number likely will NOT be manual. The private <> public chain automation issue is: how would the Job Owners, the people who own the tasks, decide whether they want a permissioned-only Keeper set on a private chain, only the PowerAgent Keepers on a public chain, or Keepers on both?

But the first problem that we have is we just want people on private chains to have a go at running our PowerAgent Keeper nodes. Nothing says PowerAgent can’t run on purely private EVM chains. Even purely private-to-private transactions on the most private chains still have a problem, which is that the participants really can’t trust anybody else on their chain. They may know exactly who each other are. They may all be fully regulated, big-time TradFi organizations, but that doesn’t mean they automatically trust each other at the transaction level. They’re counterparties to each other’s trades. So it isn’t obvious how they manage trust, how they manage automation, how they manage random allocation of tasks, for example. But PowerAgent does all that out of the box.

Davide: Yeah, so the private blockchains are an interesting question. I’m pretty interested in these, but it can be somewhat controversial since, as you’re saying, it’s difficult for them to trust each other to an extent. Going back to the infrastructure itself, yeah, Chainstack runs both Corda and Quorum. And so, yeah, you definitely have that option there. And talking about private, we also run Oasis Sapphire, which also offers private transactions. So that one is also an interesting concept.

Gordon: Yeah, I mean, for us, the thing is to get PowerAgent bundled so that people who are running those private chain clients can just click and add a PowerAgent Keeper node and start to think about, okay, can I earn on automation? Can I monitor my own Jobs on PowerAgent? Who’s going to include me in their signer sets?

People talk a lot about TradFi only running on private chains, but remember the whole purpose of TradFi, all of it, the whole purpose is to dump on retail. That’s what it does. So eventually, there has to be a connection between private chains and public chains. And then the issue becomes, who does the automation? Is it only the nodes on the private chains? Is it only the nodes on the public chain? Both?

We want to get into a situation where there’s no technical issue, meaning anybody who’s running Corda, Quorum or Oasis Sapphire is also running PowerAgent, because the option is already there in the tech stack, and it is permissionless. Then we can discuss, chain by chain, okay, who does what? One thing I can tell you, I have been in discussions with Corda and Quorum, they do NOT want a Chainlink monopoly on automation.

Davide: Yeah, I would say monopolies are not good anywhere. So you definitely want to have that distributed network. I think we could explore what we can do between Chainstack and PowerPool for sure. I think at the moment, we really just provide the node for the data, to get data, and to send transactions. I don’t think we do any validation from our side. We try to stay out of that. Especially when you’re talking about handling or depositing users' or customers' funds, we try to stay away from that. So we’ll definitely be able to talk and see if we can explore some synergies there because that would be definitely interesting to offer something extra on all chains, public and private.

Gordon: I think that too much of DeFi ignores the whole TradFi running on emerging private chain space. Other than Chainlink, they just assume there’s just nothing they can do, and no tokens to buy. But I think that in both our cases, it is not really true.

I think that was all from my side. Do you guys want to ask anything about PowerPool?

Pulkit: No, I think this has been a very engaging conversation. It’s definitely been very educating as well, the way you threw light on the different offerings that you have. And right now I think I’m more focused on just going back to the team and trying to find opportunities and create synergies exactly like what we started the conversation with. So yeah, let’s take a look at that, shall we?
