Interview with Paradigm – with Charlie Noyes, Georgios Konstantopoulos, and Hasu

Hasu: Welcome to Uncommon Core, where we
explore the big ideas in crypto from first principles. The show is hosted by
Su Zhu, the CEO and Chief Investment Officer of Three Arrows Capital, and me,
Hasu, a crypto researcher and writer. Su, my regular co-host, is sitting out
today, and I welcome instead on the show Charlie Noyes and Georgios
Konstantopoulos, of Paradigm, one of the largest investment funds in crypto. We
covered a lot of ground in this episode, starting with how Paradigm can so
consistently identify and back the category defining protocols and
companies in crypto. Next, we talked about MEV and how it can be mitigated,
including some actionable advice on how you can avoid getting sandwich-attacked today.
Then we moved on to Uniswap V3, discussing its new features and
remaining challenges. We also discussed Paradigm's thesis for Cosmos, and why
each blockchain should adopt the IBC protocol. Finally, we had a really
interesting and spirited conversation about the value proposition of Bitcoin,
and whether it is still intact at a time when Bitcoin dominance in the market is
at an all-time low.

This led us to compare our personal wish lists for
Bitcoin improvement proposals, which was a lot of fun. Enjoy. Please introduce yourselves, maybe
starting with you, Charlie. Charlie Noyes: I'm Charlie. I'm on the
investment team here at Paradigm and have been for about the last three years
now. Before that, I worked at another crypto focused investment fund called
Pantera. And before that, I was at MIT. Georgios Konstantopoulos: Hey, everyone,
I'm Georgios, I do research at Paradigm where I also work very closely with our
portfolio companies. Before that, I worked with many crypto companies doing
architecture and code reviews, so anything on the technical side, and
before that I did some security. Hasu: Paradigm is, I would say, the fund
in crypto that, more than any other fund, has the image of being able to predict
the future. For example, you have been early to the AMM space, the lending
space, the stablecoin space.

And now layer two and MEV. What is your secret? Charlie Noyes: So I don't think there's
any secret recipe. I do think that Paradigm is unique in affording everyone
here the opportunity to go as deep as they want in any rabbit hole that they
want to jump down. So I think maybe the best example would be like Dan Robinson
met Hayden really early on, just as Uniswap was getting the Ethereum
Foundation grant, in like late 2017, early 2018, and basically fell in love
with xy=k. And, you know, over the last three years, Dan did basically
everything from helping to formalize the initial protocol spec to doing a lot
of analysis on it. And there were some opportunities for other folks on the
team to collaborate with him on that, like the fee paper I published with
Tarun Chitra at Gauntlet, and Dave White's solution to the problem we posed,
finally culminating in Dan's Uniswap v3 contribution, you know, basically
designing that protocol. And that was really the culmination of three years
spent down that rabbit hole, which is pretty incredible.

Also on the team, like, I spent a good amount of time on MEV
myself, and, you know, we were able to translate that into Flashbots. Dave
White, who we brought on after he solved that Uniswap fee problem, has
recently been spending a lot of time on perpetual products and
funding-rate-based products. He published a pretty fantastic overview of
perpetual futures that actually clarified for me some of the things I wildly
misunderstood about how they worked. And yeah, so just generally, I guess,
you have to spend time down rabbit holes.

And I think that makes Paradigm
pretty unique. Hasu: So what are some rabbit holes that
you're going down right now that haven't led to any investments yet? Charlie Noyes: Well, I guess one
really interesting thing that came out recently was Dave White just published
basically a new financial product called Everlasting Options, which uses funding
rates to roll options continuously, rather than needing to exercise them at
discrete points in time. It was interesting for a couple of reasons.
First, you know, it was a deep rabbit hole that he came out of with potentially an
entirely novel financial product, drawing on some of the concepts that we
as a firm have spent a lot of time thinking about. Like in Maker's case,
you know, the funding rate, the interplay between the DAI stability fee
and the DAI savings rate, is an argument that Dan and I have had many
times.

Hasu: It's also one of the first
arguments that we ever had, the argument about negative interest rates,
one of my first contact points with Paradigm. Charlie Noyes: I forgot, yeah, you were
actually there for that. Yeah, so that was super fun. And then, you know, after
last summer, we made an investment in a protocol called Reflexer, which has
sort of an automated, control-theory-based interest rate component. And I
think that was sort of a prepared mind, you know, having spent a lot of time
thinking about the stability fee in the context of Maker and what could make
sense there. And now with Dave's Everlasting Options proposal, perhaps
there will be the opportunity to draw on all these various different
concepts, you know, in areas that folks on the team have spent time
rabbit-holing down over the last three years.

Hasu: And I would say you're probably
also one of the most hands-on funds. So can you give an example of how you
would work with a portfolio company? Charlie Noyes: We help a lot of
portfolio companies with mechanism design. Georgios helps with writing
code; I do not. Sam is like an audit god. So I think we are pretty hands-on.
It kind of depends on the needs of the project. Like, we have everything in our
portfolio from super early stage, kind of far-out-there, very crypto-native
things, to what you might call traditional companies. And, you
know, I think we just try to help out however we can. Georgios Konstantopoulos: Yeah. So in
Optimism's case, what happened was that there was lots of organizational
complexity due to working across multiple code repositories.

So I
undertook a project where we combined all of these repositories into a
monorepo and completely automated the versioning and deployments of Optimism's
systems. And this kind of lets us minimize errors that occur from manual
processes, where, you know, some developer runs a command locally. And
this was an example where the company was kind of resource
constrained at the time. So many things were happening.

Everybody wants to build
something on Optimism. And basically, there was nobody available on their team
who could do it. And I said, okay, why not? Let me do it. I'll work with you for
two, three, four weeks, however long is needed to get the job done. Hasu: Did that produce any, like,
generalizable learnings or processes that could be applied to other portfolio
companies? Georgios Konstantopoulos: Definitely. So
the whole automation and kind of organizational approach to complex
code bases like Optimism definitely carries over towards, you know, making
sure that these mistakes are not repeated in other cases. Hasu: So to take a
turn here and dive a bit into the future of crypto, because I feel like that's
what our listeners would most like to hear from you: sort of hearing the future
from you before it happens. And a big topic that you both saw coming years ago
is MEV. Could you maybe give a short intro about what MEV is? Charlie Noyes: Yeah, MEV is miner
extractable value. And it's the idea that there are profitable opportunities
to order transactions in specific ways. Alternatively, I guess you could
think of it as: certain orderings of transactions on a blockchain are more
profitable for the sequencer than others.

And this creates incentives. On Ethereum today, we have mostly
seen arbitrage bots competing in transaction fee auctions to be able to
get, you know, their transactions in front of things like DEX trades (front
running), and now we're starting to see miners participate in this more
directly. But I think generally, you can just think of MEV as the profit
opportunity, you know, of some transaction ordering, which miners /
sequencers / validators, depending on the context that you're in, whether it's
a proof of work or proof of stake blockchain, or even in layer two, can
access given their power to order transactions arbitrarily.
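To make Charlie's framing concrete, MEV can be thought of as the gap between the best transaction ordering and the default one. Here is a deliberately tiny toy model; the profit function and all numbers are invented purely for illustration, and real searchers obviously do not brute-force permutations:

```python
from itertools import permutations

# Toy model: the sequencer earns each transaction's fee, plus a bonus
# when a large trade lands directly behind a smaller one (a stand-in
# for backrunning opportunities).

def sequencer_profit(ordering):
    """Profit of including `ordering`, a tuple of (trade_size, fee)."""
    profit = sum(fee for _, fee in ordering)
    for (size_a, _), (size_b, _) in zip(ordering, ordering[1:]):
        if size_b > size_a:              # a backrun-style opportunity
            profit += size_b * 0.01
    return profit

txs = [(100, 1.0), (5000, 2.0), (300, 0.5)]   # as seen in the mempool

baseline = sequencer_profit(tuple(txs))                     # naive order
best = max(sequencer_profit(p) for p in permutations(txs))  # best order
mev = best - baseline
print(f"reordering is worth {mev:.2f} extra")  # reordering is worth 3.00 extra
```

Even in this toy setting, the sequencer's power to pick an ordering is worth real money, which is the whole phenomenon in miniature.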

Hasu: Yeah. And as I understand it, the
extraction of this MEV is very hard to avoid, but the way
that it is extracted produces some negative externalities for the
respective blockchain. Maybe, Georgios, could you walk us through what those
negative externalities are, and what options there are to minimize
them? Georgios Konstantopoulos: Yeah, of
course. So the main negative externality, which is a byproduct of MEV,
is that when there are many, let's say, arbitrage bots going after the
same arbitrage opportunity, only one of them will be able to take
it, while everyone else ends up with a reverted transaction. The
implication of that is that the blockchain ends up having lots of junk
transactions which don't do anything. That's redundant computation, that's
redundant storage. It's not great. So that's the first one. The second one is
that MEV creates these priority gas auctions, where the bots end up paying
exorbitant amounts in gas prices to get their transactions included.

And this
ends up pushing up the average gas prices on the network. So in a way,
there's a negative externality towards the users themselves; the previous one
was a negative externality to the protocol. And then there's the final negative
externality of the peer-to-peer layer carrying lots of transactions that
don't end up mattering. Hasu: What is sort of the downside of,
you know, having a flooded peer-to-peer layer? Georgios Konstantopoulos: I mean,
ideally, nothing too bad; your node just ends up using more bandwidth.

Charlie Noyes: Okay, yeah, it ties back to the
debate that has been had in the Bitcoin community in the past
around the costs of running a full node, and the cost of
decentralization within a network. I think there's some argument
that p2p-layer congestion does raise those resource costs. And then,
depending on the form of MEV, it can get more extreme.

Like in
the past, we saw something called back running, which was basically: if
you have a bunch of transactions with the same gas price, the one that
ends up getting included directly behind another is functionally random, or was
at the time, based on the default gas config. So you ended up getting people
basically spamming transactions on chain, rather than just competing
directly on the transaction fee as you can with a front-running
style of MEV. So the exact negative externality depends on the context. But
there are a bunch of interesting ones. Hasu: Hmm, this may be my subjective
experience, but I feel like it has also become harder to make low-fee
transactions on Ethereum. I feel like you broadcast
them, but it's no longer possible to wait for three days and have them
get executed.

I feel like when you send them to the Infura node or
whatever, they get booted from the mempool. And if your own node doesn't
rebroadcast them, then, you know, they're just never going to get executed. And would you
say that's also a negative externality of having just too many transactions
flying around relative to the capacity? Georgios Konstantopoulos: It's a UX
degradation for sure. Charlie Noyes: Yeah. And I think it
highlights that a lot of the infrastructure just was not really
designed with this eventuality in mind. Put another way, you know,
we have a high degree of reliance on Infura across the ecosystem, and
some other folks like them, and I don't think it's their fault in any way. But
the MEV stuff has just kind of exposed a lot of gaps in the
architecture. And, you know, they get worse at scale.

Hasu: Yeah. And there's one project
that's actually trying to improve this infrastructure and mitigate the negative
externalities, and that is Flashbots. Georgios Konstantopoulos: Flashbots is
basically a research collective with the mission of democratizing MEV and
reducing its negative externalities, which we already talked about. Flashbots'
main project, let's say, is MEV-Geth, which is a fork, or a patch, on top of the
normal Go Ethereum client implementation, which allows users to
submit transactions to a kind of overlay mempool: not the one where
everybody competes on gas prices, but one which is only for MEV transactions.
And what's been happening with Flashbots is that they
have gotten miners on board; right now they have over 85% of the hash rate
onboarded.

And this means that traders are able to submit transactions to miners
and compete with each other, without at the same time having to
compete with normal, non-MEV-extracting users for inclusion. Hasu: How does the auction mechanism
work that's used in the MEV-Geth mempool? Georgios Konstantopoulos: It's a sealed-bid
auction; you bid on a bundle of transactions. Basically, your bid
is the average gas price multiplied by whatever gas you consume. And we allow
users to submit bundles of transactions, so, you know, it's not just you
submitting one transaction; you can submit 10, or 20, or 30 transactions.
And this has an interesting implication: by being able to submit more than
one transaction, you can either do something like account abstraction on
Ethereum, where you can send a transaction with a zero gas price from
one wallet, but have a follow-on transaction from another wallet that
pays the fee. Or you can do sandwich attacks more efficiently, which is a
kind of attack that you can do on automated market makers to force traders
to get the worst execution price that they can.
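Georgios' description of the bid, the average gas price times the gas consumed, can be sketched as a small scoring function. This is a simplified model of how a miner might rank competing bundles, with made-up numbers, not the actual MEV-Geth code:

```python
# Simplified model of bundle scoring as described above: the bid is
# effectively the bundle's average gas price, i.e. total payment to the
# miner divided by total gas consumed. All figures are illustrative.

def bundle_score(txs):
    """txs: list of (gas_used, gas_price_gwei). Returns the effective
    (average) gas price that the sealed-bid auction ranks bundles by."""
    total_gas = sum(gas for gas, _ in txs)
    total_payment = sum(gas * price for gas, price in txs)
    return total_payment / total_gas

# "Account abstraction" style bundle: the first tx pays a zero gas
# price (e.g. a wallet holding no ETH) while the follow-on tx from a
# second wallet carries the whole bid.
sponsored = [(21_000, 0), (100_000, 300)]
plain = [(90_000, 150)]

print(bundle_score(sponsored))  # ~247.9 gwei: still outbids the plain bundle
print(bundle_score(plain))      # 150.0 gwei
```

Because only the average matters, a zero-gas-price transaction can ride along as long as another transaction in the same bundle pays enough to keep the bundle competitive.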

Hasu: Let me interrupt you there for a
second. Is that because you can see the trade that you sandwich in the mempool,
and it's already signed, so you can take it and put it in
your own bundle? Georgios Konstantopoulos: Exactly,
exactly. So it's basically like the bot checks the mempool, picks up the
transaction it wants, creates the sandwich, and serves the sandwich to
Flashbots. Charlie Noyes: It might be worth saying,
like, I think the DEX sandwiching example is probably the best to
build up from, quote unquote, simplistic
extraction to the maximum that you could possibly do, and how it
changes in different contexts.

So, like, you know, we started with: I make a DEX
trade, and Georgios, an arbitrage bot, gets into a gas auction to
get his trade directly in front of mine and front-run me. That's something
you can do without being a miner and without a Flashbots bundle,
and we've seen that for the last year. Now, ideally you would be able to get a
trade both in front of and behind the DEX transaction, right, not just in
front. However, there's no way to compete for that priority
in the native Ethereum gas auction; it just doesn't work that way.
At one point, like I mentioned earlier, the default geth config randomized
transactions that were of the same gas price, so you started to see
people competing in auctions on the front and then just spamming
transactions on the back to try to do sandwiches without Flashbots.

But what
Flashbots gives you the ability to do is that within the bundle you can just say:
okay, my transaction first, then your trade, and then mine behind it,
without needing the auction or the spam. Hasu: A crucial part or attribute of the
bundle is that it has to be either executed completely or not at all. Georgios Konstantopoulos: Yes. Hasu: Could this also be used to, let's
say, set a very low slippage tolerance on a DEX trade, so that the trade
only executes when it can actually get filled? Because, you know,
that's the problem, right? Why people don't set lower
slippage tolerances is that then their transactions fail too frequently. Georgios Konstantopoulos: Yes,
a transaction on Flashbots, if it does not get picked,
remains in the pool of bundles.

So yes, what you said definitely is
possible. And I think that, like, most traders, MEV traders or not, should just
use Flashbots. Hasu: So you would have to make the
bundle conditional on the DEX trade being executed successfully. Georgios Konstantopoulos: You would
have to add a smart contract to execute your trade… Charlie Noyes: Which I think ArcherDAO
does, right? Georgios Konstantopoulos: Yeah, yeah. Hasu: I don't fully understand what they
do. It's not the same as Flashbots, I think. Georgios Konstantopoulos: In a way, all
of these application-layer MEV mitigations work by introducing a
smart contract which kind of encodes all the rules about when the MEV
opportunity should be fired. And if that condition does
not pass, the transaction never gets mined. Hasu: Couldn't you just do it like this:
you have one transaction that is, like, a zero-slippage DEX trade, and then
the fee is paid in the currency that you buy on the DEX. That way
you have just two transactions, no smart contract needed. Wouldn't that
also satisfy the condition? Georgios Konstantopoulos: That is
possible, but right now, the profit calculator only uses ETH.

Hasu: Okay, but when you buy ETH on a
DEX, then you can do it this way. Georgios Konstantopoulos: Yes, yes. Hasu: Okay, cool. So DEX takers,
actually, regular DEX traders, should also use Flashbots already. Nice. I feel
like nobody knows about this. Georgios Konstantopoulos: I think this
has to do with the fact that most user interfaces right now
are built around the experience of sending a single transaction, whereas
the experience that you described, the gasless one, involves two transactions.
So you'll need some sort of service that detects that the transaction got submitted,
and then lets you do it.

This is, like, literally some infrastructure that needs
to be built. Hasu: Okay. So, I trade a lot
on DEXs; if I want to do this, like, later today, what do I have to do?
Is it too difficult right now? Charlie Noyes: Call Georgios? Georgios Konstantopoulos: I think, yeah.
If somebody likes this idea and wants to build it, yeah, we can talk. Hasu: Okay. Charlie Noyes: So, to your original
question, Hasu, on why MEV is inevitable. One other interesting
anecdote, with the slippage tolerance stuff that you can set on
Uniswap trades natively, or on DEX trades natively, ignoring Flashbots: in
the last couple of months, we actually started to see arbitrageurs
intentionally causing trades to fail and revert. That is, saying: well, this person set
a 1% slippage limit, and it only costs me, like, a few dollars to force
their trade to fail. And I expect that if I do this, they're then going to come
back and set a higher slippage limit, and then I'll be able to profit
off of it.

So that's a very
interesting situation. It's not an instantaneous profit.
I don't know, I think it speaks to kind of the nature of MEV. The
reason it feels inevitable to me is that there is sort of an unknown
upper bound on how much potential profit is available from doing this, and
there's pressure that builds up over time. As
the market gets larger and larger, arbitrageurs get better and better at
extracting this, our understanding of MEV improves, and it becomes clearer
what you could do. So with the back running example again: the
geth config randomized transactions at the same gas price. This caused
people to spam transactions because, you know, it raised the probability that
they'd be able to backrun some given one. Then the config got changed so
that the spam was no longer incentivized. And, like,
very quickly, you started to see adoption of alternatives.

And, like, over time, I think, you know,
there is some hopefully stable equilibrium at which
the network doesn't just completely degenerate into an
unusable state, or something ridiculous like infinite time-bandit
attacks. And then there's sort of a very naive state in which you
assume that, like, people in general aren't going to do
these kinds of things, that they're going to be honest. I think that hypothesis,
the honesty hypothesis, has been basically definitively proven false at
this point. Like, we know that people, miners, arbitrageurs, whatever, you
know, are going to want to go do this to some extent.

And then I think the
question becomes: can we find some stable equilibrium, or is it going to
degenerate? And there's no real option to, like, stick with the current state, because
we're already, you know, sort of trending further and further in that
direction. Hasu: What you described, where someone,
especially a frontrunner or sandwich attacker who has a very tight grip
on a certain trading pair, deliberately makes a trade
fail, makes a second trade fail, and then sandwiches the third one
based on the now-highest slippage tolerance: I would say that's a multi-block
sandwich attack, right? And we have seen very little of that so far; what you
describe is maybe one of the first examples. However, when
we are moving to proof of stake, after the merge, Ethereum's consensus will
change in a very fundamental way: namely, who will produce a certain block
will be known in advance.

So basically, I know, like, I'm
next to make a block, and then it's Georgios' turn, and then it's Charlie's
turn. And if we cooperate, then we basically control the next three blocks.
And there's no probabilistic element there at all, right; the Poisson process
that we have in place today, which makes this harder, will be completely gone. So
how does this change, you know, the nature of MEV? Charlie Noyes: That's a great question.

Georgios Konstantopoulos: There's two
angles here, right? So firstly, there's the angle that, because validators are
known ahead of time, there is an incentive to collaborate and try to
collude, and so on. So I think that's a valid point: if validator ordering is
known, that's not great. Right now, I don't think that's
supported by Ethereum 2, but there is, like, a branch of research that
is on secret, private election of validators. And so if you
were to apply that, you would bypass that limitation. In a similar vein,
Tendermint and other chains have, like, a VRF module, which allows them to
randomize validator election. Charlie Noyes: I'll give you an
unsatisfying answer. I think we have to hope the same
thing that we do currently with Ethereum miners, which is that they're not
going to intentionally reorg blocks.

So far, they haven't. But I would say that
with respect to, you know, known block producer election in a proof of stake
world, at least before, to Georgios' point, we implement, like, a secret election
scheme, hopefully it doesn't happen, or if it does happen, we very quickly have
a way to randomize that. That's a great question, I actually haven't heard
someone ask that before. Hasu: Georgios, I have a bit of an
operational question, since you are very familiar with EIP-1559. So after
this, sort of, many gasless transactions won't really be possible anymore that
are possible today, because then every transaction will have to burn some
amount of ETH. What is sort of the interaction between Flashbots and these
kinds of gasless transactions? So that's the one part.

The other part
is that, instead of having the current model, where each user has to pay
ether from their wallet, the model could in the future change so that the block
producer needs to burn some ETH from their wallet, and that's it. And then
the block producer individually negotiates with its users on how they
will cover his costs. But the block producer ends up being, like, the central
party that has to pay the fee.

And that kind of makes the issue with
EIP-1559 nonexistent. Hasu: Yeah, I actually tried a while ago
to make a trade with zero gas that had some slippage tolerance. So I tried to
pay via worse execution on Uniswap, but it didn't yet work. Georgios Konstantopoulos: You wanted to
get intentionally front run so that they… Right, right. Hasu: Correct. I didn't
have any ETH in that account. So I thought, why not give it a shot. Georgios Konstantopoulos: There is a
nice blog post on ETHResearch by Lakshman Sankar, who explains this kind
of technique where you can bait frontrunners into putting your transaction
on chain.

Hasu: Yeah, yeah, we'll link to that in
the show notes. That's a great call. So we talked about MEV. And the biggest
source of MEV is actually the decentralized exchanges on Ethereum, and
especially, like, liquidity being spread across multiple exchanges, which then
allows for these arbitrage opportunities between them, where bots
basically front-run the price on one exchange when a trade happened to push it out
of line with the other exchanges. You guys have incubated Uniswap, which is
the biggest decentralized exchange by far, and not just in terms of
adoption; you could say it made the DEX space. And Uniswap has a
new version coming out now. Georgios, could you describe what's new in Uniswap
v3? Georgios Konstantopoulos:
Uniswap V3's biggest feature, or rather difference from v2, is that when you're providing
liquidity, you get more capital efficiency. What this means is that
instead of providing liquidity across the entire price range, as was the case in Uniswap
V2, Uniswap V3 allows liquidity providers to provide liquidity within a certain
range.

And they've been calling that concentrated liquidity. And what this
implies is that if you have, let's say, a pool with $1 million
worth of liquidity: in Uniswap V2, you would be able to make, you
know, some trade, whatever, depending on the xy=k formula. But
in v3, if the liquidity is concentrated between, let's say, 0.99 and 1.01,
such as would be the case for a stablecoin pair, perhaps like on Curve, then
you can get the same amount of, like, price impact for some
trade, but with much, much less liquidity.

And this directly translates
to capital efficiency. So that's one big feature of Uniswap V3. Another
feature is the rework of the oracles, which allows you to statelessly get
arbitrarily sliced TWAPs over the last week, I believe, or, sorry, whatever
65,000 blocks amount to. So, better price oracles using Uniswap V3's
liquidity. And we think that's one of the most understated features of it.
And one could say that with Uniswap V3, you basically start to
approximate something that looks like an order book at times. So basically, instead
of having an AMM curve which is very smooth and follows a certain equation,
now it is made up of, like, many smaller curves between all the ticks
that get covered.
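The capital-efficiency point can be made concrete with the position formulas from the Uniswap v3 whitepaper. The sketch below (my own illustrative numbers, not from the conversation) compares the capital needed to back the same liquidity over the full price range versus the [0.99, 1.01] stablecoin-style range mentioned above:

```python
import math

# Token amounts backing liquidity L over [p_lower, p_upper] at the
# current price P, per the Uniswap v3 whitepaper formulas:
#   x = L * (1/sqrt(P) - 1/sqrt(p_upper))    (token0)
#   y = L * (sqrt(P)  - sqrt(p_lower))       (token1)

def amounts_for_liquidity(L, price, p_lower, p_upper):
    sp = math.sqrt(price)
    sa, sb = math.sqrt(p_lower), math.sqrt(p_upper)
    sp = min(max(sp, sa), sb)   # outside its range a position is one-sided
    return L * (1 / sp - 1 / sb), L * (sp - sa)

L = 1_000_000  # illustrative liquidity figure

# v2-style "full range" position (bounds pushed toward 0 and infinity):
x_full, y_full = amounts_for_liquidity(L, 1.0, 1e-12, 1e12)
# Concentrated position on [0.99, 1.01], the stablecoin example above:
x_narrow, y_narrow = amounts_for_liquidity(L, 1.0, 0.99, 1.01)

# At price 1.0 both tokens are worth the same, so capital is x + y.
efficiency = (x_full + y_full) / (x_narrow + y_narrow)
print(round(efficiency))  # roughly 200x less capital for the same depth
```

For this narrow band, the same market depth needs roughly 1/200th of the capital, which is the kind of capital efficiency Georgios is describing.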

Hasu: Okay, so when I'm an LP in Uniswap, and let's say, like, the price
of ether goes up by $500, then how do I ensure that my liquidity is centered
around the market price and I'm not left behind? Georgios Konstantopoulos: So it doesn't,
it cannot get done automatically. Right now, if you're a liquidity provider, and
the price moves outside of the range that you're providing, you would need to
kind of poke your smart contract, your position, and reposition it around the
new price. As a result, it could probably be expected that yield aggregators
or other sorts of outsourced infrastructure will exist, and start to kind
of do these services for you. So in a way, the v3 construction, and the fact that
positions may need to be more actively managed, means that there will be a whole
new ecosystem explosion on top of it.

Hasu: Do you see any risk that Uniswap
V3 will sort of kill the lazy liquidity providers? Charlie Noyes: I think it'll become more
competitive, and there's going to be a gap between, you know, the most
efficient and competitive liquidity providers and those that don't want to
put as much thought into it. And I think there already is, to some extent; like,
you see people taking, you know, leveraged positions to be able to
provide more liquidity for a given capital base in Uniswap v2. But
obviously, in v3, like, the delta is going to get much bigger.

And I think that
probably, well, hopefully, there will be a proliferation of
new types of aggregators that basically, like, implement, you
know, LP strategies for different pools that are more competitive than just, sort
of, the bog-standard xy=k full-spectrum default. I think that'll be
quite interesting. And it'll mean that, you know, it's not like,
quote unquote, normal users are kind of just stuck with one option
relative to a bunch of much more competitive market makers.

I think it'll be
more interesting than that long term. You know, I think it's hard to say,
though, like, how the split will work out. Hopefully, it'll still give more
of an opportunity for normal folks to participate than you see on, like,
centralized limit order books. Like, you know, back in 2015, I was
market making on GDAX, or Coinbase Pro now. And I'm certainly no longer doing
that; I don't think any normal person really is. And hopefully, that will still
be possible to an extent on Uniswap long term. Hasu: So Georgios mentioned that yield
aggregators could sort of provide active LP management strategies for
people. What are some other infrastructure projects that you would
like to see built on top of Uniswap that you think are low-hanging fruit
right now? Georgios Konstantopoulos: Generally,
there should be tooling around managing liquidity, whether that is for following
the price, or for adjusting your quotes as their relative markets change.

Right now, the basic smart contract only lets you input,
like, a specific range, and that will be for one
position; it does not let you, for example, create the full distribution of
quotes that you want to participate in the market with. So that
kind of tool would be something that's missing. Something that does, like, more
extensive simulations on LP returns; maybe there's work to be done on how the
optimal LP returns for Uniswap change with v3, and that might
build on Charlie's and Dave's previous work. Charlie Noyes: I think also yield
forming stuff.

It goes from being one big, you know, homogenous pool, to ver
heterogeneous and and and the curren uniswap. You know, liquidity minin
, base liquidity mining programs. I thi k I think there's gonna be very inter
sting stuff to be built around that. And Uniswap V3. Hasu: Yeah, I mean, implementing yield
farming on current Uniswap. Right, what they did when the uni token launc
ed, is very trivial. But now, it see s to be much harder to even do the sam
thing, right? Is that a solved prob em, or how to even do yield farming on
op of Uniswap V3? Georgios Konstantopoulos: So right now,
there is no canonical yield farming contract that's in the public, as far as I know. The main challenge, rather, is that you need to figure out when a position is in range, so that you are properly paid. And recently, I think less than a week ago, there was a PR merged on the Uniswap V3 periphery repository which changes the logic and enables that. So all the pieces are in place, and somebody has to build the canonical Uniswap V3 liquidity mining contract.
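The in-range requirement Georgios describes is the crux of the problem. A hypothetical sketch of the accounting: a position only earns while the pool's current tick is inside its range. The per-block reward model and every name here are illustrative, not the actual contract logic.

```python
# Toy Uniswap-v3-style liquidity mining accounting (illustrative only).
# positions: list of (liquidity, tick_lower, tick_upper).
# tick_history: the pool's current tick observed at each block.

def accrue_rewards(positions, tick_history, reward_per_block=1.0):
    """Split each block's reward pro rata among liquidity that is in
    range at that block; out-of-range positions earn nothing."""
    rewards = [0.0] * len(positions)
    for tick in tick_history:
        active = [(i, liq) for i, (liq, lo, hi) in enumerate(positions)
                  if lo <= tick < hi]
        total = sum(liq for _, liq in active)
        if total == 0:
            continue  # nobody in range: this block's reward goes undistributed
        for i, liq in active:
            rewards[i] += reward_per_block * liq / total
    return rewards
```

The real difficulty, which the PR Georgios mentions addresses, is doing this accounting lazily on-chain (per-range fee-growth accumulators) instead of iterating over blocks and positions as this sketch does.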

Another thing that would be nice to see implemented is LP shares as collateral in lending protocols, like Maker or Compound. And pricing these is slightly harder than how you would price the normal v2 shares. So there is some work to be done on that.

Hasu: Right, that was actually one of my first questions when I saw the white paper for Uniswap V3: whether this would make leveraged yield farming obsolete. I feel like when Alpha Homora came out, it was just such an insane success that I wouldn't have predicted at all. But then I realized, after talking to Dan, that basically Uniswap V3 makes leveraged yield farming obsolete anyway, because the LP positions are already implicitly leveraged, and way higher.

Georgios Konstantopoulos: Exactly, the
tighter the range you're providing liquidity to, the higher your leverage.

Hasu: Yeah, that's when I actually became bearish on Alpha Homora instead of on Uniswap V3, when I realized that they are both competing on the same thing, which is concentrated liquidity. Right, that's all that leveraged farming actually gives you: concentrating your liquidity around the current market price.
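Georgios's point can be made quantitative. A minimal sketch based on the capital-efficiency factor discussed in the Uniswap v3 whitepaper, under the simplifying assumption that the current price sits at the geometric mean of the chosen range; the factor is 1 / (1 - (p_a / p_b)^(1/4)):

```python
def capital_efficiency(price_lower, price_upper):
    """Liquidity of a v3 position on [price_lower, price_upper] relative
    to a full-range (v2-style) position of equal value, assuming the
    current price is the geometric mean of the range."""
    return 1.0 / (1.0 - (price_lower / price_upper) ** 0.25)

# A roughly +/-1% band earns on the order of 200x the fees per dollar of
# a full-range position, for as long as the price stays inside the band.
narrow = capital_efficiency(0.99, 1.01)
```

The flip side, as discussed here, is that the position's exposure to price moves, and to being pushed out of range entirely, scales up by the same factor.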

So you get more transaction fees, and more yield, yeah. Though I take it that Uniswap V3 does require a lot of transaction overhead, right? You need to constantly message the Uniswap pool about where your liquidity is, depending on market conditions. Do you think this is even viable on layer one, or will Uniswap V3 only fully come into shape on L2?

Charlie Noyes: I think it's certainly viable, because the base case is just as much interaction as you see on Uniswap today, if everyone, you know, is just full spectrum xyk. And then in terms of the trade-offs, like fees, costs of adjustment, frequency, and all that stuff, I think it's just a market discovery process. So it will definitely be more efficient; I think it's fair to ask how much more efficient, and probably difficult to say before, you know, we get some kind of data in the wild.

Georgios Konstantopoulos: For sure. Like, lower latency and lower fees mean that LPs will be comfortable with adjusting their positions more frequently.

Charlie Noyes: Yeah. Yeah.

Hasu: I am mainly trying to understand, like, how big is the on-chain footprint of Uniswap V3 actually going to be? It feels like there's a lot of incentive for LPs to constantly update their positions, leading to a lot of transactions.

Charlie Noyes: I think that's true. I think it's also true, though, of many of the strategies that are implemented by, like, yield farming aggregators today; one of the biggest value propositions is socializing gas costs and minimizing transaction overhead by pooling. I think that's also true on Uniswap V3: there will be a higher transaction footprint, and, you know, it's a question of how much higher, relative to how efficient you want to get. And then, you know, with much lower latency and much lower fees, the markets will be more efficient. And then there's some question of trade-offs, you know, whether you're just there to simply trade, or you want to buy the asset to use it in other protocols, and then need to move back to the main chain.

So, I think there are a bunch of... I don't really view any of those as, like, potential showstopper issues, more so just the kinds of things where we're kind of going to have to see what the market prefers in practice. Perhaps, you know, a lot more usage comes from highly composable, main-chain-necessary activity than we might guess today; perhaps a lot more of it is just purely trading based, and really doesn't care at all for anything other than speed and efficiency, and will just prefer L2. You know, we make guesses, but it's just, I think, the kind of thing where it'll be interesting to observe.

Hasu: Based on what you just said, I also
realized that we will probably see a lot of bundled liquidity in some sort of smart contract that then executes these active strategies and bundles it all into one transaction. Yeah, though, have you thought about sort of the front running aspect of providing liquidity to Uniswap V3? Because the miners can obviously, like, update their quotes first. Do you think this will sort of make miners the primary liquidity providers?

Georgios Konstantopoulos: Miners?
Flashbots users? TBD, I think.

Hasu: Yeah.

Okay. But we can say that it's getting harsher out there.

Charlie Noyes: It's getting harsher out there. And I think the dynamics also are so complex that it'll be interesting to see what happens.

Georgios Konstantopoulos: But I mean, it's not bad necessarily, right? If the liquidity ends up being very concentrated, yes, you will end up taking a 2% slippage fee, or whatever it is, on your transaction.

But maybe that means that the base fee that you end up paying for every transaction ends up being, you know, the minimum acceptable slippage by the market.

Hasu: Yeah, right. It's just that, for the LPs,
I'm concerned that, you know, for every trade that pays a fee, the miner can just almost fill it completely. And then for a trade that moves in the other direction, they can always pull out the liquidity. Yeah, I guess, I mean, for users, maybe this is great. I don't know. I think it's fascinating to speculate about, but I have no special insight and no strong intuition about how any of this is going to play out.

Georgios Konstantopoulos: And also, just to say the obvious, you know, all these attacks need to be weighed against all the other benefits that you get. And it seems to me that the benefit here is very much worth any additional complexity introduced. Yeah.

Hasu: Also, staying with front running
for a second. So we talked about one way that front running is going to be mitigated, and some people already mitigate it today, on the peer to peer layer via gasless transactions.

What are sort of the options that we have to further minimize this problem, one, on the application layer, and two, maybe also on the protocol layer in the future?

Georgios Konstantopoulos: Yeah, so we
can think of the mitigations for MEV as three layers. Firstly, there is the application layer, which is things that KeeperDAO and ArcherDAO are doing. Then there's the network layer, something like what Flashbots is doing by introducing a new API endpoint for transactions. And then there's also the protocol layer, which is basically introducing randomization in the transaction ordering, or kind of splitting the processes of ordering and execution.

Approaches here include threshold cryptography, or using a random beacon to randomize your transactions; others use, you know, a VDF, others threshold signatures. You can get creative on that layer.

Hasu: Are we going to see any of those
in Ethereum? Maybe on layer two?

Georgios Konstantopoulos: On layer two, I think that's possible. On layer one, maybe some randomization based on an insecure random number, maybe the previous block hash or something; maybe it's good enough. It won't be a perfect solution, but maybe, like, having something that's half good is better than nothing.

Hasu: Yeah. So we addressed the peer to
peer layer. I became sort of curious: what's possible on the application layer? So maybe, are there any… can you construct a DEX in a way, for example, using batched auctions, or something like that? Yeah.

Georgios Konstantopoulos: Yeah, exactly.
I think the best way to think about it is that when trading on a simple AMM like Uniswap, there's only two variables that kind of determine your execution price: the reserve of one token and the reserve of the other. In order to mitigate MEV, you need to reduce the overlap between two users' transactions. And, you know, in Uniswap, if there's two users, they always touch the same two reserve values, because they're always trading on one pair. Whereas if you have, for example, an order book... now, again, in an order book, two users can still go after the same trade. But in a batch auction, everybody gets the same price, independent of the order in which they submitted their transactions.
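To make the contrast concrete, here is a minimal numeric sketch of a hypothetical constant-product pool with no fees (all numbers illustrative): sequential AMM execution gives traders order-dependent prices, while a batch auction clears everyone at one uniform price.

```python
def amm_sequential(x, y, orders):
    """Execute swaps one at a time against a constant-product pool (x*y = k).
    Each order sells an amount of token X; the realized price depends on
    where in the sequence the order lands."""
    prices = []
    for dx in orders:
        dy = y - (x * y) / (x + dx)  # output that preserves x*y = k
        x, y = x + dx, y - dy
        prices.append(dy / dx)       # this trader's execution price
    return prices

def batch_auction(x, y, orders):
    """Clear all of a block's orders in one batch: net them against the
    pool once, and give every trader the same uniform clearing price."""
    total = sum(orders)
    dy = y - (x * y) / (x + total)
    return [dy / total] * len(orders)
```

Within the batch, reordering no longer affects anyone's price, which is exactly what removes the incentive to front-run by transaction ordering.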

So here is another example of ordering and then executing.

Hasu: Wrapping up the MEV topic and changing gears here a little bit. Paradigm has made many investments in the Ethereum ecosystem. But Ethereum is not the only base layer you support. Especially you, Charlie, you have been a vocal supporter of Cosmos. What is your
thesis? Cosmos is a bit different. It's not like one blockchain. It's essentially a set of protocols and tools.

Charlie Noyes: So I think Cosmos is one of the few, maybe the only project, that is trying to enable, like, a vibrant application ecosystem, with practical tools in developer tooling and in various protocols that enable interoperability and sort of the other features that have allowed DeFi to grow up on Ethereum, without sort of providing, like, a new platform-style blockchain. So if you think about Ethereum and many of the L2s that are coming out for it, and sort of, like, these alternative, quote unquote, smart contract blockchains: essentially, all of them try to augment or offer, like, an alternative platform to Ethereum. Cosmos instead enables projects to launch on their own blockchain. And the reasons that you might want to do that are that it gives you more fine-grained control over your environment, like both the execution environment and the incentives of the system. And I think that we've seen, like, with Ethereum that certain protocol layer issues, like MEV, kind of necessitate, like, an application layer response. And that's only possible to some extent when you're running on top of a shared platform: no Ethereum application, at least not without great difficulty, can re-architect, like, you know, the fundamental transaction ordering model of Ethereum. And Cosmos gives you the opportunity to do that.

Hasu: So is that something that you would use Cosmos for? I would love to hear, like, one or two examples, maybe, of applications that lend themselves specifically to using an application chain on Cosmos instead of building on top of Ethereum.

Charlie Noyes: Yeah, and so maybe a few different examples. I think, like, the Cosmos hub, and a few other Cosmos zones, which is how they refer to the individual blockchains that run on top of the stack, have DEXs with batched trades, and, like, enforced batched execution of trades in each block. So this is, like, essentially a transaction ordering constraint that's specific to the application, and it affects front running. Like, it's a dynamic that you can't accomplish on Ethereum today, at least not without a VDF and a bunch of other, like, very fancy tricks. I think that would be an obvious example of something that's enabled by Cosmos.

Hasu: Right. But you do give up a lot in
return, right? So let's say you build a DEX on an application chain on Cosmos. Is there a simple way to explain how it is possible that you might use assets from other application chains on Cosmos, for example the Cosmos hub?

Charlie Noyes: Yeah, so Cosmos provides the first generalized interoperability protocol that we've seen in crypto, actually, called IBC, the inter-blockchain communication protocol. And essentially, any blockchain with finality and efficient light client proofs.

So most proof of stake blockchains, everything within Cosmos, Substrate-based chains, Ethereum, etc., can all adopt IBC and use it to implement various different modes of interoperability. One of these, and it's actually live today, is cross-blockchain asset transfers, so you can kind of think of it as a generalized bridge. If I want to build, for example, a lending market within Cosmos, I can deploy that application to its own blockchain; perhaps I'm the only validator, perhaps I pay some people to do that for me, whatever. And any user that wants to send, like, for example, atoms from the Cosmos hub, or, like, Luna from Terra's Cosmos zone, to my lending market is able to do that, sort of by default.

Hasu: You said it's basically like a generalized bridge. How does this work in comparison to a trusted bridge, you know, one like WBTC, where we all know how it works on Ethereum?

Charlie Noyes: I guess a couple of thoughts on that. The first: I think it's probably better to think about it just in terms of IBC as a generalized bridge, versus, like, idiosyncratic implementations, whether they be custodial in nature or not. So, like, WBTC, or, like, Solana's bridge to Ethereum, or any of the Ethereum L2s' bridges to the main chain, or to BSC, or whatever, are all kind of idiosyncratic protocols.

They have different trust assumptions. Some of them are more decentralized than others. Some of them are more custodial and centralized. But, broadly speaking, they are not, like, generalized or shared. And this makes it quite difficult to get efficient interoperability across a wide variety of platforms: you frequently have to route through sort of, like, hubs. And you end up with multiple versions of the same asset on different platforms, depending on sort of the route that you've taken.

IBC, in contrast, provides you the ability to,
in a generalized fashion, basically make pairwise bridges with minimal trust
assumptions.

Hasu: Cosmos does have a native token, ATOM, but it's not necessary to use ATOM to create your own application chain using the Cosmos SDK. It's also not necessary to adopt IBC and, you know, to communicate with another blockchain in the Cosmos ecosystem. So one thing that I hear a lot is that the value proposition of ATOM basically springs from the Cosmos hub, which is a special blockchain, a special zone, inside the ecosystem. Could you say something about what is basically the thesis for the Cosmos hub?

Charlie Noyes: So there's nothing
inherently special about the Cosmos hub. And maybe that is what's special about it. Sort of unlike in any other project, the Cosmos hub is not privileged by the protocol. It's on an equal footing to any other zone. It does happen to sort of have natural centrality, or be a natural Schelling point, for the development of the protocol, and for users and developers of other Cosmos SDK chains.

And so today it's imagined that, or I guess I should say, in the short to medium term, the Cosmos hub plans to provide certain functionality that requires a high degree of trust and credible neutrality. So some examples of these would be bridges to, like, Ethereum and Bitcoin. And then in the future, it's imagined that the Cosmos hub will adopt a shared security model and […] towards an end state that looks broadly similar to, like, a rollup-based ETH2 or Polkadot's main chain; the mechanism for it just hasn't been decided yet. They decided to build the rest of the protocol on the interoperability spec first, rather than, like, a main chain first.

So it's kind of just a different approach.

Hasu: What do you mean by shared security?

Charlie Noyes: Shared security in, well,
the literal sense. Like, many blockchain protocols, I guess many smart contract blockchains, start with a generalized execution environment that anybody can deploy applications to, and within that environment they sort of all share the same security: the security of all Ethereum applications, you know, is uniform across the platform. And there are other examples of blockchains, like Polkadot, which, although it doesn't provide, like, a generalized execution environment or smart contract functionality on its main chain directly, does provide, like, a built-in auction mechanism to allow different applications to bid for the rights, essentially, to share security with that main chain, you know, which in theory has sort of, like, the most credible validator set and, like, the highest degree of security.

Cosmos, in contrast to these approaches, rather than starting with a shared platform, or an architecture for shared security, started with interoperability. And then, you know, it's going to build towards whatever long term vision makes the most sense with respect to sort of, like, the network security topology. And so I think of them as just kind of working towards likely the same end state, from sort of opposite sides of the problem.

Hasu: Aha. Okay, so we established that
applications can build their own blockchain using the Cosmos SDK, but usually they are proof of stake blockchains, right? So in order to generate security and finality, they need validators. And how are they going to get those? I think what you suggested is that they can basically buy validators, for example, from the Cosmos hub, but also maybe from other places.

Charlie Noyes: Yeah, I mean, I think
that that's, like, a simplistic and reasonable model for security sharing, like, in the short to medium term, that doesn't require, like, sort of deep cryptographic or, you know, incentive design work.

I think, like, long term, there is sort of an open question as to what an ideal, you know, shared security layer for all cryptocurrency applications, or different ecosystems of applications, you know, potentially different niches depending on their use case, looks like. Whether that's a generalized execution environment, whether that's a sharded execution environment, whether that's, like, a rollup-centric ETH2, or whether it's something like Polkadot, you know, with its parachain auctions. I think people will likely experiment with multiple different models. One of the first will probably be leasing validator sets, without, like, sort of more direct incentive
engineering around it.

Hasu: Would this then mean that, sort of... so I have my app chain, I need validators, I go to the Cosmos Hub, where they already have their ATOM staked, and I just say, please validate my chain as well? And then that ATOM becomes exposed to double staking risk, basically, right?

Charlie Noyes: Yeah, it's essentially
sharing the slashing risk. I mean, there are multiple different, like, modes you could imagine. One is purely social, which is: if a validator on the Cosmos hub, or a set of validators from the Cosmos hub, which have, like, significant money and credibility at stake, you know, come and validate your chain, or you pay them to do that, and they become malicious.

Like, that's going to have social consequences. And I think it is, you know, honestly, not that dissimilar of an assumption to, like, weak subjectivity. But you can take it a step
further, and you can actually couple their slashing conditions, such that if those validators, you know, violate, or, like, make an invalid state transition on your blockchain, then they get slashed on the hub too. And you can tie the incentives together directly. And there's, like, a very broad spectrum of potential protocols in this vein. I think that, like, probably many of them will make sense in many different contexts, and the Cosmos hub is, like, an interesting vehicle for their exploration.

Hasu: Yeah, it's pretty interesting to
think about a blockchain where, you know, the validators are not paid in... they don't stake the blockchain's native token.

So it feels like they might be, you know, less incentivized to protect the health of the blockchain, if the token that they stake is not exposed to it, if it's not basically a bet on the blockchain.

Charlie Noyes: I'm not sure that I would
fully agree with that. But to the extent that that is true, it's an incentive security issue that is shared by, like, well, any shared security layer, whether it be a generalized execution environment, or one that's directly leased. I think that's probably more of a general comment about non-application-specific security layers.

Hasu: Maybe the bigger point would be that this, you know, just the idea of application chains: doesn't this force application developers to be protocol developers as well?

Charlie Noyes: Well, I think they're already forced to be protocol developers, I mean, in practice.

Hasu: And that is because they have to understand all the idiosyncrasies of the blockchain they're building on, or why?

Charlie Noyes: Well, they have to understand all the idiosyncrasies of the blockchain they're building on. Like, building on something like Ethereum doesn't obviate MEV, or, you know, sort of prevent your application's exposure to it, or other things, like, you know, the communication layer.

Like, we have all the Infura stuff; you know, we now have various different private memory pools and this type of thing. So those are all sort of features that have, like, meaningful implications at the application layer, and that application developers are probably going to have to contemplate; like, if they choose not to, it's to the detriment of users. So I think they're already kind of forced to be protocol developers, and many of them may want, like, more control over their environment and the ability to, you know, fit the protocol to their applications.

Georgios Konstantopoulos: I agree with
what Charlie said, and I think that an underestimated aspect of the Cosmos ecosystem is the SDK, which allows you to very easily build these chains by using all the kind of community-made modules for providing very common features that you will need when building a blockchain, instead of having to rebuild them on your own.

Charlie Noyes: Yeah, and, you know, I might get in trouble for saying this, but, like, even to the extent that you don't like the Cosmos thesis and you're not interested in building, like, a Cosmos based blockchain, there are still many parts of the protocol that are probably going to be purely additive to everything in crypto. Like IBC, as a generalized communication layer, is the interoperability protocol that should be adopted by essentially everything, like every, every blockchain. And so I think, like, when you reframe the conversation that way, you can maybe start with, like: are certain applications going to want to build their own blockchain, or have their own blockchain and design their own protocols? And if your answer to that is, like, feasibly yes, then Cosmos is probably your best bet right now.

Georgios Konstantopoulos: In the same vein, I think it would be valuable for us to eventually have a Solidity contract or an Ethereum precompile for accepting IBC-type messages.

Charlie Noyes: Yeah.

Hasu: So, what is the way that you would use IBC on Ethereum right now?

Charlie Noyes: Well, every single L2 and every single non-Ethereum platform right now has its own bridge. That doesn't need to be the case if you adopt IBC, and any protocol […] can. So those should all most likely be replaced with IBC as, like, a generalized spec.

Hasu: Is this viable, Georgios?

Georgios Konstantopoulos: I think so;
currently, all the rollup systems either don't have an interoperability protocol, or don't have a plan right now for talking with each other. They will either have to roll their own bespoke one, which will have to follow some kind of message format, or they could try to leverage one of the existing ones. And I think that using IBC would be a good choice. Similarly, for interoperating, for example, with the Polkadot ecosystem, I think it would be valuable to adopt IBC over the XCMP protocol that they're currently using.

Hasu: And is this a case of "there are 20 different standards? This is not sustainable. So here, let me make the 20… you know, the 21st standard?"

Georgios Konstantopoulos: I think this is not a case of there's 20 standards, and let's make the 21st. This is the first or second interoperability standard. It has always been the expectation, in my opinion, that this should become the one thing for chains to communicate over.

Hasu: Why is it that all of those
Ethereum layer twos don't adopt IBC now? What is their thinking?

Charlie Noyes: I mean, it's just relatively new.

Georgios Konstantopoulos: Because there's so many things to do, and so little time, and it's still very, very new, as Charlie said, as a protocol.

Hasu: How difficult is it for an existing layer two to rip out their protocol and replace it with IBC?

Georgios Konstantopoulos: I think trivial, because you would implement it as a smart contract.

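A toy version of that idea, in Python rather than Solidity. The packet fields loosely follow the IBC packet structure from the ICS-4 spec, but the checks below are a simplified, hypothetical subset; a real implementation also verifies commitment proofs against a light client of the counterparty chain.

```python
from dataclasses import dataclass

@dataclass
class Packet:
    # Field names modeled on the ICS-4 packet; simplified for illustration.
    sequence: int
    source_port: str
    source_channel: str
    dest_port: str
    dest_channel: str
    data: bytes
    timeout_height: int  # 0 means "no height-based timeout"

def validate_packet(packet, expected_sequence, current_height):
    """Hypothetical subset of receive-side validation rules: the sequence
    must be the next expected one on this channel, the packet must not
    have timed out, and it must carry a payload."""
    if packet.sequence != expected_sequence:
        return False  # out-of-order or replayed packet
    if packet.timeout_height != 0 and current_height >= packet.timeout_height:
        return False  # timed out: must be refused (sender gets a refund path)
    return len(packet.data) > 0
```

This is the sense in which IBC is "just a message format plus validation rules": the hard part an L2 inherits for free is agreeing on exactly these semantics.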
That's the beauty of using smart contracts. They don't require protocol changes; you just define it in your smart contract, which implements whatever you want. Well, for IBC, it is just a message format, more or less, and some validation rules on that message format. And that's about it. And you would then need to maybe adjust your client code on how it would interpret these messages. But I don't think it's a large overhead, assuming that you have clear semantics around the protocol.

Hasu: If we entertain this idea and say
impact on the Cosmos ecosystem? Does it just, you know, do the borders
disappear? Or what happens?

Charlie Noyes: I think it's very possible that the borders disappear. I mean, there's some joke that, like, everything is a Cosmos chain. That's, like, not really a joke, because Cosmos doesn't privilege any chain. Like we said, the hub is not special in any way, and neither is Ethereum, if Ethereum, or any application developers on top of it, decide to support IBC and interoperate in that way. I think, like, again, philosophically, you know, if Cosmos works, yeah, it won't have borders, and it will make it possible for the first time to, like, in a reasonable and practical manner, interoperate between different blockchains without, like, a bunch of idiosyncratic middleware. And I think that's not so much a question of dissolving Cosmos's borders as a question of dissolving, like, borders between all blockchains.

Hasu: So Cosmos is like a bet on blockchain globalism?

Charlie Noyes: Sure.

Yeah.

Hasu: I guess it doesn't privilege any blockchains, but what about Bitcoin, for example? I mean, could Bitcoin ever adopt IBC? Or will it forever…

Charlie Noyes: No. No, I mean, Bitcoin could choose to adopt IBC by having, like, a precompile, and, like, choose to, you know, sort of process the consensus messages of remote blockchains like Ethereum or the Cosmos hub or whatever. But, like, Bitcoin will almost certainly never choose to do this. There's, like, near 0% probability, I would guess; perhaps, you know, as unlikely as Bitcoin supporting smart contract functionality generally. And, you know, Bitcoin is unique in that it doesn't have finality, and is unable to understand the consensus messages of remote blockchains, so it is unable to adopt any interoperability protocol. And that's why every Bitcoin bridge that you see essentially is either custodial or needs to build crypto-economic incentives to secure the interoperability mode, like, on the remote blockchain. So whether this is, like, in WBTC's case, obviously it's just custodial, or in KEEP, which uses collateral.

You have, I don't know, I guess, other, you know, forms of interoperability. But notably, none of them really involve Bitcoin. It's all, you know, building scaffolding around it, because the Bitcoin protocol is blind to the world around it.

Hasu: I would say that sort of the
sentiment around Bitcoin, especially with regard to, you know, Ethereum, is, even though Bitcoin is trading at, like, 50k or something, pretty low. And that's because Ether has, you know, made some advances towards becoming more of a store of value; like, with EIP-1559 and proof of stake, it could reasonably in a year have lower inflation than Bitcoin. And that, of course, you know, has made a dent with a lot of people, who now see it as much more desirable, even to the point where some people, you know, expect a Flippening this cycle.

And I don't want you to comment on the price, because I know that's not really possible. But I do want to sort of leverage the fact that you are known to have an extremely long time horizon when it comes to, you know, your allocations to crypto. And I just want to hear from you: do you think that your thesis for Bitcoin is still intact?

Charlie Noyes: If you think about it in terms of use cases, I think Bitcoin's use case as, like, a maximally censorship resistant store of value, or non-sovereign asset, is intact. Bitcoin has a function and a purpose, and I think it fulfills that elegantly. It's quite minimalistic, really, compared to Ethereum. Um, I think that up until relatively recently, potentially sometime, you know, early in 2020, there was, like, some broad question as to whether there were going to be use cases there that were, like, meaningful.

And then we had sort of the DeFi renaissance. And at this point, I think that there are, like, quite a lot of products on top of Ethereum that make, like, fundamental sense, and feel like they should exist and have, like, a purpose in the world, and that millions of people are using. And I think, like, with respect to Ether as the Ethereum network's native money, it seems to be, like, gaining adoption in that use case. And I guess I think about it less in terms of competition, and more that Ether is succeeding because Ethereum is succeeding. It's not obvious to me that there's, like, near term competition with Bitcoin.

Georgios Konstantopoulos: I agree with
that. Also, Ethereum's hard forks sometimes may make it hard for people to think about it on longer time horizons. So I think that this kind of difference in the governance layer kind of paints a very, like, clear place for both of these to coexist for a very long time.

Hasu: Is there anything that you would like changed in Bitcoin? Or do you think that, the way that it is, it can, you know, carve out its niche and survive alongside these other crypto assets that are more rapidly adapting to sort of the demands of the market?

Georgios Konstantopoulos: For me,
there's two things.

Firstly, I would like a minimal change to the scripting language, which would allow us to build more flexible protocols for off-chain scaling, and for key management, such as vaults. And secondly, and I think that may be hard to make the case for today, but it may become a very popular discussion topic in the future, we may end up needing a finality gadget on top of Bitcoin. I don't know what that will look like, whether that will mean that there is some proof of stake component that ends up being built on Bitcoin, or something like that. But I think that we will need a finality gadget as the block reward approaches zero.

Hasu: Okay, cool. I'd love to unpack
those. So starting with the vaults. So what is it vault, Georgios? Georgios Konstantopoulos: A vault is a
system which allows you to maintain custody of your funds, even if one of
your keys gets stolen. Basically, you have a hot key and the cold key and if
your hot key gets stolen then you can always use your cold key to claw back
your funds and recover them.
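The clawback flow Georgios describes can be sketched as a toy state machine (illustrative only; the names and the 100-block delay are assumptions, not a real Bitcoin vault implementation):

```python
from dataclasses import dataclass, field

@dataclass
class Vault:
    """Toy vault: the hot key can only *initiate* a withdrawal, which
    stays pending for `delay_blocks`; during that window the cold key
    can claw the funds back, so a stolen hot key alone cannot spend."""
    balance: int
    delay_blocks: int = 100                      # assumed unvaulting delay
    pending: dict = field(default_factory=dict)  # id -> (amount, unlock_height)

    def initiate_withdrawal(self, wid: str, amount: int, height: int) -> None:
        # Authorized by the HOT key: starts the timelock, moves nothing yet.
        assert amount <= self.balance
        self.pending[wid] = (amount, height + self.delay_blocks)

    def finalize_withdrawal(self, wid: str, height: int) -> int:
        # Spendable only once the delay has elapsed without a clawback.
        amount, unlock = self.pending.pop(wid)
        assert height >= unlock, "timelock has not expired yet"
        self.balance -= amount
        return amount

    def clawback(self, wid: str) -> None:
        # Authorized by the COLD key: cancels a pending (possibly
        # thief-initiated) withdrawal any time before it unlocks.
        self.pending.pop(wid)
```

A thief holding only the hot key must broadcast the withdrawal and wait out the delay, giving the owner time to claw back with the cold key.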

And today, this kind of protocol can be implemented, but it is tricky to do, and it has very specific limitations. Whereas by augmenting the scripting language, we could get an easier-to-implement and more flexible protocol. Hasu: Are vaults only useful to improve custody solutions around Bitcoin? Georgios Konstantopoulos: Yes, vaults are an improved custody solution. Hasu: Okay. What is the difference between a vault and a covenant? Georgios Konstantopoulos: Vaults are built using covenants. Charlie Noyes: What are the hot keys used for in Bitcoin? In Ethereum, your hot key would normally be a staking key or a voting key or whatever. Georgios Konstantopoulos: Assuming you want to do a transfer. Or maybe it's a Lightning wallet. Hasu: I mean, an exchange needs a hot wallet, because they need to process transactions. Georgios Konstantopoulos: Exactly. So the one change that would be nice is vaults. And the other is off-chain scaling solutions. Currently, it seems that Lightning has had trouble scaling the amount of assets inside of it.

And it's not clear if this is about usage, or if it's due to the fundamental limitations of payment channels around capital efficiency. Hence, I think that modifying the scripting language in a way that would allow introducing some kind of protocol that looks like the very popular rollups in Ethereum would be a net positive for the system. Hasu: Oh, so you could create a version of rollups in Bitcoin, and it would require only minimal changes to the scripting language? Georgios Konstantopoulos: I'm not claiming I can; I'm saying it would be nice if that were the case. Charlie Noyes: I mean, you could imagine enshrining a SNARK for proof checking only, for a UTXO off-chain environment. I think historically, Bitcoin developers have said that the cryptography is too new. But long term, that seems like a minimalistic sort of change, and seems reasonable, I guess. Georgios Konstantopoulos: Yes, exactly. So an OP_SNARK_VERIFY or OP_STARK_VERIFY opcode in the future could be something which satisfies this condition. While we're at it, on the topic of changes, I just had another thing come to mind: an EIP-1559-like mechanism could also be a net positive.

Hasu: And by that, I guess you don't mean to introduce a permanent fee burn, like EIP-1559 has, but rather to spread out the rewards, maybe? Georgios Konstantopoulos: Yes, there are various ways you can implement the fee burn, or rather, the component where the miner does not get the entire reward of the block. Ethereum's implementation chose to burn it. In another implementation, we could do something else: we could choose to distribute the block reward over the next 100 miners, for example.
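As a sketch of that alternative, here is one way the accounting could work, where each block's fees are paid out in equal parts to the miners of the following 100 blocks (the window size and the fee-only framing are assumptions for illustration):

```python
from collections import deque

SPREAD = 100  # assumed window: each block's fees pay the next 100 miners

def smooth_rewards(fees_per_block):
    """Return what the miner of each block actually receives when every
    block's fees are split 1/SPREAD-th to each of the next SPREAD miners."""
    window = deque()   # [per-block share, payments remaining] entries
    payouts = []
    for fee in fees_per_block:
        paid = 0.0
        for slot in window:          # collect shares from earlier blocks
            paid += slot[0]
            slot[1] -= 1
        while window and window[0][1] == 0:
            window.popleft()         # this block's share is fully paid out
        payouts.append(paid)
        window.append([fee / SPREAD, SPREAD])  # current fees enter the window
    return payouts
```

A miner's income then depends on the trailing fee history rather than on any single block, which blunts the incentive to reorg for one unusually fee-rich block.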

Hasu: Yeah, I think that would be a good
idea. Georgios Konstantopoulos: So in general, these are the three kinds of changes that I would like: scripting-language-related changes, a finality gadget, and EIP-1559. Just to be very upfront: I think that Bitcoin as it is today is fine. These are nice-to-haves, not must-haves. Hasu: Ah, let a man dream! So I do want to talk about the finality gadget. What does it mean? Georgios Konstantopoulos: It would mean that you introduce an additional layer on top of the proof-of-work consensus that says: every, let's say, 100 or 200 blocks, whatever the number, that block becomes the last block in the system, and reorgs cannot be triggered past that point. Hasu: Is it just a different word for a checkpoint? Georgios Konstantopoulos: No, because checkpoint is an overloaded term. When people say checkpoints, they seem to think that you can start to discard the entire chain history and all that, which is not great.

Whereas what we're talking about is just disallowing reorgs past a certain threshold. You could argue that today there's already some kind of implicit social consensus around the max reorg depth, which would be the time after which a coinbase becomes spendable, namely 100 blocks. But I think it would be nice if this detail could be enshrined in the protocol.
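In node software, that rule might look something like the following simplified check (a sketch over a toy chain model; this is not Bitcoin Core's actual fork-choice API):

```python
MAX_REORG_DEPTH = 100  # mirrors coinbase maturity, as discussed above

def should_accept_reorg(current_chain, candidate_chain):
    """Chains are lists of block hashes from genesis to tip. Accept the
    candidate only if it is longer (standing in for more total work)
    AND it does not rewrite blocks deeper than MAX_REORG_DEPTH."""
    fork = 0
    for ours, theirs in zip(current_chain, candidate_chain):
        if ours != theirs:
            break
        fork += 1
    depth = len(current_chain) - fork   # how many of our blocks get discarded
    if depth > MAX_REORG_DEPTH:
        return False                    # past the "point of no reorg"
    return len(candidate_chain) > len(current_chain)
```

With this check, a shallow but longer fork is still adopted normally, while a fork that rewrites history past the threshold is refused regardless of its length.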

Charlie Noyes: Safety is already a collective hallucination. Georgios Konstantopoulos: Yeah. Charlie Noyes: I remember when there was an inflation bug in Bitcoin, and I made a Twitter poll at the time that was along the lines of: if there were a certain number of Bitcoin that we knew to have been minted through it, like 100 blocks ago, would we be willing to unwind the chain and go back? And I think it's a very tricky question, in the same way that, as of right now, there's the collective hallucination of six-block finality. If there were a 50-block reorg, I don't expect that it would be accepted by the network at the social layer; I just don't think it would happen.

And so I think the finality gadget point is: if you're already in that state, and this is already an assumption that you're making implicitly, then if you have some reasonable cryptoeconomic mechanism to enshrine it more directly, it seems like a reasonable idea. Hasu: Yeah, I agree. And the way we would do this would be just to tell our nodes not to accept reorgs deeper than x blocks, is that right? Charlie Noyes: That would be one way. I mean, it probably has worse security properties than one in which you make, say, a 12-month liveness assumption and have something like Casper FFG, a finality gadget with a proof-of-stake overlay. Hasu: How does that work? Where does the proof of stake come in? What do the stakers do? Georgios Konstantopoulos: There would be a committee of validators, elected pseudorandomly and weighted by their stake, where the stake would be denominated, let's say, in Bitcoin.

And these validators would sign on a block hash, which would be considered the checkpoint, the point of no reorg. Charlie Noyes: I don't think Casper FFG had to have a pseudorandom committee. I think it was a signature of all validators, right? Georgios Konstantopoulos: It doesn't matter. I did not call it Casper FFG. I just said that there needs to be some collective signing via a committee. Hasu: How deep would you personally put this point? Is it like 100 blocks deep? Georgios Konstantopoulos: I don't know. I think this is up for debate, simulation, and other things that are beyond my security clearance. Hasu: It would have the interesting side effect that you could allow people to stake their Bitcoin, even if it's just for a small reward.

Yeah. Georgios Konstantopoulos: Maybe, if that allows you, for example, to earn transaction fees, turning Bitcoin into a productive asset trustlessly. Hasu: That would be nice. Charlie Noyes: Yeah. I also think someone had thrown out an idea for charging transaction fees on the basis of coindays destroyed, which always sounded potentially interesting long term. On Bitcoin, you have very few ways to game that system. Whereas on Ethereum, it's very hard to, for example, charge a percentage transaction fee, because there are an infinite number of ways to implement a transaction and avoid the constraint. Whereas on Bitcoin, it's probably feasible within the existing protocol to charge that coindays-destroyed-based transaction fee without there being a secure way to game it. Hasu: It's a beautiful example of why soft forks can indeed be evil. Charlie Noyes: Certainly, yeah. Hasu: Yeah.
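To make the idea Charlie floats concrete, a coindays-destroyed fee could be computed per transaction roughly as follows (the rate constant is a made-up illustration; coindays destroyed is the standard metric of value moved times the age of the coins being spent):

```python
FEE_RATE = 0.0001  # hypothetical fee per coinday destroyed

def coindays_destroyed(inputs, current_day):
    """inputs: list of (value, day_last_moved) pairs for the coins a
    transaction spends. Older, larger coins destroy more coindays."""
    return sum(value * (current_day - day) for value, day in inputs)

def cdd_fee(inputs, current_day):
    # This is checkable on Bitcoin because input age and value are
    # on-chain facts, unlike a percentage fee on a general-purpose
    # chain, where value transfer can be obfuscated in contract logic.
    return FEE_RATE * coindays_destroyed(inputs, current_day)
```

So a transaction spending 10 coins untouched for 100 days pays far more than one churning freshly moved coins, which is exactly why it is hard to game by restructuring the transaction.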

I mean, this is a super interesting topic, this whole question of price discrimination. What do you think about that in general in blockchains? Should blockchains price discriminate? Because right now there seems to be a strong consensus that they shouldn't. Charlie Noyes: I feel like they probably shouldn't, in some philosophical sense, but in practice, it's completely unsurprising that they do. Georgios Konstantopoulos: Objectivity at scale is fundamentally hard. Hasu: So, we mentioned the finality gadget. What about something like consensus before the full block has been mined, like 0.5-block finality? Is there a way to do this, to give us a bit lower latency on Bitcoin, like a pre-consensus thing? Georgios Konstantopoulos: Like the type of proposal that was made for running the Avalanche consensus on Bitcoin Cash blocks, for example? Hasu: Exactly.

I really liked that proposal, to be honest. I wish Bitcoin Cash had done it at the time. Georgios Konstantopoulos: I'm not well enough informed on that specific proposal to give you a good comment. Charlie Noyes: Yeah, hypothetically, though, you could have a head chain, like how on ETH2 there's a finality gadget, I believe, a certain number of blocks back from the head. And you could do something similar on Bitcoin. Georgios Konstantopoulos: I mean, the moment you introduce an additional consensus, the first problem is getting people to agree that it's okay to introduce an additional layer of consensus participants other than the miners. If you do that, I think we can go crazy with designing protocols which improve finality, or latency, or anything like that. I don't think the hard problem is designing a protocol that would be useful like that; I think the hard problem is actually getting everyone to agree on it.

That, and getting everyone to agree on the fact that there might be a consensus overlay layer, maybe one day. Hasu: Okay, two questions as a follow-up to this. The first: yesterday, Elon Musk sent two tweets that basically criticized the energy usage of Bitcoin, and this strikes right at the thing you just said, which is: should we pay miners, or should we get consensus via some other method? Do you see Bitcoin ever abandoning proof of work? And if not, do you think this will eventually become a problem? Charlie Noyes: I think the environmental questions are worthwhile. But in general, I've found that the arguments are not super salient. I think Nic Carter has written a lot on Bitcoin's energy usage.

And so maybe I would take the question less as "should Bitcoin switch off of proof of work, given the energy consumption, on a substantive level?" and more as "if this is going to be a common criticism of Bitcoin, should it just switch off of proof of work to obviate the criticism?" I would guess probably not; that's probably not a good enough reason to get folks on board. But potentially long term, and in concert with a declining block subsidy, I think it's possible to imagine that by 2050, Bitcoin has adopted a hardened, tested, proven proof-of-stake consensus system or overlay. Georgios Konstantopoulos: Taking the opposite side on this: I think the whole approach of using mining heat emissions to do other things inside the circular economy is a very valid thing. Let's just assume that we cannot get rid of the proof of work, because it may have to stay, because we value objectivity a lot.

So we have to figure out what to do with it. I think that only now is the problem getting the dimensions it, quote unquote, deserves, and only now are we going to see the proper measures, or creative solutions, developed to address or maybe utilize this effort. For example, for me at least, there's the use case where, if there is excess energy, you use Bitcoin as a quote-unquote energy battery. Maybe many people would call that not a great idea, but personally, I think there are lots of areas where we have a lot of sun and all that energy just doesn't get used at all.

Or, for example, when you want to transmit energy from A to B, lots of energy gets lost along that cable. So I'm not positing A or B; I'm just saying that it's worth digging hard at the problem and seeing if it can be inverted in some way, instead of just saying it's bad, let's give it up, let's move on to the next thing. Hasu: By "inverted" you mean to actually turn it into a strength of Bitcoin, right? Georgios Konstantopoulos: In some way, or a new use case.

Hasu: Yeah. I mean, you do see credible reports that the energy grid actually can be stabilized in many parts of the world. Georgios Konstantopoulos: Yeah, so, little-known fact: I studied electrical engineering. And yes, all of this stuff around load balancing of the grid, where you turn on your Bitcoin machines to consume excess supply, or the opposite, you turn off the machines, this is a very valid use case. Or even just the simple one: you have a Bitcoin miner, it emits heat, and you use it to warm your greenhouse. The products that will utilize all these forces have not been created yet. Hasu: I had a second question. And again, I don't want to comment on the likelihood of this at all.

But if Ethereum or another coin were to flip Bitcoin, and Bitcoin were to lose its number one spot, I'm curious to explore what this would mean for Bitcoin in the cultural slash governance sense. So we talked about how it's very difficult to change Bitcoin today, because you just wouldn't get consensus on anything, and people rightly have adopted this very conservative mindset that the rules are set in stone forever. Do you think this is something that could change if the Bitcoin community were to feel more competition and urgency? Do you think there could be a quote-unquote cultural awakening? Charlie Noyes: I mean, sure, but first, just to state it right, since I'm sure we'll get tweeted at for this, we should probably title the episode "two Ethereum dudes talk and speculate about Bitcoin." I mean, I wouldn't profess to be very deep in the Bitcoin community or have any special insight. Hasu: Oh, you see yourself as an Ethereum dude? Charlie Noyes: I don't see myself as an Ethereum dude. I just mean that there is a defined subset of the crypto community that are very specifically Bitcoin people, and we're now speculating on their social dynamics, which could be completely wrong. But, yeah, just to say the disclaimer or whatever. That being said: sure. It seems like if Bitcoin felt less secure in its use case, then perhaps it would feel forced to make changes. Historically, I think probably the closest the issue has come to a head was the block size stuff.

And I think that was successfully argued as unnecessary, and Bitcoin has obviously been quite successful since then and still feels secure in its use case. So I don't know. Perhaps there's never a problem that moves the Bitcoin community's setpoint, and it always remains "we're fine, and this is unnecessary." And perhaps not, and at some point we see Bitcoin proof of stake. I think it would be cool. Georgios Konstantopoulos: But yeah, I see myself as somewhere in between, because I follow all of the Bitcoin protocol development. Although I don't think that anything we say speaks for the community; it's more about our independent thinking as engineers, investors, protocol designers, and all that. So it's more of an entertaining thought experiment. Hasu: Thank you so much, guys, for coming on the show.

I think it was a great discussion. Thank you. All right.
