7 Blog Posts about Tech Future

Proteins do more than serve as the building blocks of the body — the ones that serve as mostly static structural components are actually a special case. More generally, proteins are self-assembling nanomachines that do almost everything in the body. Your cellular processes — everything that can be said to make you alive — are tasks carried out by proteins.
Proteins are defined linearly. They are coded by strings of nucleotides in your DNA and RNA. They are formed by chains of amino acids reacting with each other. But despite this simple linear identity, proteins act in time and space. Once produced, atomic forces cause them to self-assemble into messy 3D structures that determine their function. Proteins are fundamental to pharmaceutical research, where scientists are often trying to find a molecule that will activate or inactivate a particular protein. Since we only know the structures of around a quarter of the proteins in the human body, this has often been a trial-and-error effort. By using AlphaFold 2 or its successors to create a catalog of the structures of every protein humans can produce, scientists will be able to reason about which molecules could be good candidate drugs, dramatically reducing the error rate. This, in turn, could turbocharge drug development and enable the discovery of cures for almost every disease. We may even discover that already-approved drugs can be used to treat conditions we hadn’t tried them on yet.
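
Side note (mine, not from the post): to get a feel for what a "catalog of structures" looks like in practice, here is a minimal sketch of pulling a predicted structure from the public AlphaFold database by UniProt accession. The endpoint path, response fields, and the example accession are assumptions based on the AlphaFold DB REST API and may have changed since.

```python
# Minimal sketch: fetch an AlphaFold-predicted structure for one human protein.
# Assumes the public AlphaFold DB REST API at alphafold.ebi.ac.uk; the endpoint
# path and response fields are assumptions and may differ across versions.
import requests

ACCESSION = "P00533"  # example: human EGFR, a well-known drug target

resp = requests.get(
    f"https://alphafold.ebi.ac.uk/api/prediction/{ACCESSION}", timeout=30
)
resp.raise_for_status()
entry = resp.json()[0]  # the API returns a list of prediction entries

# Save the predicted 3D structure as a PDB file for downstream tools
# (visualization, docking, etc.).
pdb = requests.get(entry["pdbUrl"], timeout=30)
with open(f"{ACCESSION}.pdb", "wb") as f:
    f.write(pdb.content)
print("saved predicted structure for", ACCESSION)
```
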
In a future pandemic caused by a virus unlike any we have encountered before, the ability to map the structures of its proteins would let us determine what kind of molecule is needed to inactivate it. Instead of blindly experimenting with random antimalarials, we could reason about which existing drugs could serve as a first-wave therapeutic. This could save countless lives.
There are a few caveats, though:
1) Protein nanomachinery is dynamic, but AlphaFold only predicts fixed protein structures. This limitation is a consequence of the fact that our existing techniques for empirically determining the structure of a protein — X-ray crystallography and cryo-electron microscopy — capture a static structure only. This static picture is the ground truth against which AlphaFold was trained. While AlphaFold has essentially solved the static structure prediction problem, there is a further rabbit hole of dynamic behavior to understand.
2) Operational adoption takes time; it might be many years before we see it having any real influence in the world.


To check later - Golem for performing computations, Filecoin for decentralized file storage.
The argument is that from the 1980s to the 2000s people relied on a lot of community-built open-source protocols, but after the 2000s FAAG rose to dominance and open-source protocols couldn't keep pace. The trade-off was internet access for millions at the cost of control, trust, and privacy.
He says this discouraged innovation as people started to think things were out of their control. I understand why, but I can't intuitively see that link so strongly as to place the onus upon it. "Centralization has also created broader societal tensions, which we see in the debates over subjects like fake news, state sponsored bots, “no platforming” of users, EU privacy laws, and algorithmic biases." Again, I can't see the link between centralization and most of these, especially fake news: why would it propagate less on a decentralized platform?
Let’s look at the problems with centralized platforms. Centralized platforms follow a predictable life cycle. When they start out, they do everything they can to recruit users and 3rd-party complements like developers, businesses, and media organizations. They do this to make their services more valuable, as platforms (by definition) are systems with multi-sided network effects. As platforms move up the adoption S-curve, their power over users and 3rd parties steadily grows. When they hit the top of the S-curve, their relationships with network participants change from positive-sum to zero-sum. The easiest way to continue growing lies in extracting data from users and competing with complements over audiences and profits.
Cryptonetworks - Cryptonetworks are networks built on top of the internet that 1) use consensus mechanisms such as blockchains to maintain and update state, 2) use cryptocurrencies (coins/tokens) to incentivize consensus participants (miners/validators) and other network participants. Some cryptonetworks, such as Ethereum, are general programming platforms that can be used for almost any purpose. Other cryptonetworks are special purpose, for example Bitcoin is intended primarily for storing value, Golem for performing computations, and Filecoin for decentralized file storage.
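
A toy sketch (mine, not any real protocol) of the "maintain and update state" part: each block commits to its predecessor by hash, so every participant can independently verify that they share the same history of state updates.

```python
# Toy sketch of blockchain state-keeping (illustrative only, not a real
# protocol): each block commits to its predecessor via a hash, so any
# tampering with history changes every later hash and is detectable.
import hashlib
import json


def block_hash(block: dict) -> str:
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()


def make_block(prev_hash: str, transactions: list) -> dict:
    return {"prev_hash": prev_hash, "transactions": transactions}


def verify_chain(chain: list) -> bool:
    """Check that every block correctly references the one before it."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )


genesis = make_block("0" * 64, [])
b1 = make_block(block_hash(genesis), [{"from": "alice", "to": "bob", "amt": 5}])
b2 = make_block(block_hash(b1), [{"from": "bob", "to": "carol", "amt": 2}])

chain = [genesis, b1, b2]
print(verify_chain(chain))          # True
b1["transactions"][0]["amt"] = 500  # tamper with history...
print(verify_chain(chain))          # ...and verification fails: False
```
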
Cryptonetworks use multiple mechanisms to ensure that they stay neutral as they grow, preventing the bait-and-switch of centralized platforms. First, the contract between cryptonetworks and their participants is enforced in open source code. Second, they are kept in check through mechanisms for “voice” and “exit.” Participants are given voice through community governance, both “on chain” (via the protocol) and “off chain” (via the social structures around the protocol). Participants can exit either by leaving the network and selling their coins, or in the extreme case by forking the protocol. This ends up aligning network participants to work towards a common goal: growth of the network and appreciation of the token.
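
A hypothetical sketch of what on-chain "voice" reduces to mechanically: token-weighted voting on a proposal. The names and the simple majority rule here are invented for illustration; real governance systems add quorums, delegation, timelocks, and more.

```python
# Hypothetical sketch of token-weighted on-chain governance (names and
# rules invented for illustration; real systems add quorums, delegation,
# timelocks, etc.).
token_balances = {"alice": 100, "bob": 40, "carol": 60}

# Each participant's "voice" is weighted by their stake in the network.
votes = {"alice": "yes", "bob": "no", "carol": "yes"}


def tally(balances: dict, votes: dict) -> bool:
    """Proposal passes if a majority of the voting token weight says yes."""
    yes = sum(balances[v] for v, choice in votes.items() if choice == "yes")
    total = sum(balances[v] for v in votes)
    return yes > total / 2


print("proposal passes:", tally(token_balances, votes))  # 160/200 yes -> True
```

"Exit" is the complement: a participant on the losing side can sell their tokens, or fork the protocol and take the code with them.
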
Today’s cryptonetworks suffer from limitations that keep them from seriously challenging centralized incumbents. The most severe limitations are around performance and scalability. The next few years will be about fixing these limitations and building networks that form the infrastructure layer of the crypto stack. After that, most of the energy will turn to building applications on top of that infrastructure.
Why does Chris think decentralization will win?
- Decentralized networks can win the third era of the internet for the same reason they won the first era: by winning the hearts and minds of entrepreneurs and developers. In the case of cryptonetworks, there are multiple, compounding feedback loops involving developers of the core protocol, developers of complementary cryptonetworks, developers of 3rd party applications, and service providers who operate the network. These feedback loops are further amplified by the incentives of the associated token, which — as we’ve seen with Bitcoin and Ethereum — can supercharge the rate at which crypto communities develop (and sometimes lead to negative outcomes, as with the excessive electricity consumed by Bitcoin mining).


Lots and lots of sublinks inside the article; worth digging further.
Things Patrick is optimistic about in the 2020s:
1) Opportunity due to internet
2) Progress in biology - "I think the 2020s are when we'll finally start to understand what's going on with RNA and neurons."
- One suggestion is that RNA is actually part of how neurons think and not just an incidental intermediate thing between the genome and proteins.
3) Energy tech - cheaper batteries, renewables. The second-order effects, like the decrease in air pollution, will be huge.

Why has growth slowed? Why was it fast in the 20th century?
- We might never see that kind of growth again, given demographic headwinds among other factors.
1) Science itself is changing a lot - Federal spending on R&D is about half of what it was in the 70s and 80s, as a percent of GDP. For example, peer review in the modern -- legitimacy-conferring -- sense of the term is a postwar invention and arose more as a result of funding bureaucracies and controversies than any scientific exigency. He thinks that many of the changes we have made may not have been for the best. The how of science matters more than the what.
2) Cultural shift -do we just not want good things? As Ezra Klein recently described in the New York Times, and Marc Dunkelman has written about in his great piece about Penn Station, a particular version of distorted, hypertrophic progressivism that took hold in the 1970s may have had (and still be having!) quite significantly stifling effects. We perhaps shifted from placing emphasis on our collective effectiveness in advancing prosperity and opportunity for people to the perceived fairness that was embodied in whichever particular steps we happened to take. Or, to say that another way, we shifted our focus from sins of omission to sins of commission.
3) Institutions and first-mover disadvantage - institutional dynamics and how principal/agent issues and collective action problems seep into our systems over time. "The period of the early twentieth century was an era of building in the broadest sense, from universities to government agencies to cities to highways. The byproduct of this period of building is maintenance and we haven't figured out how to meta-maintain -- that is, how to avoid emergent sclerosis in the stuff we build."
4) Talent allocation - maybe the talented people are working on the wrong things
5) Our explanations are wickeder - while we have to be careful not to over-diagnose explanations involving low-hanging fruit (since they can easily be excuses), I think it is clearly the case that the major open problems in many domains involve emergent phenomena and complex/unstable systems that often have lots of complex couplings and nonlinear effects and so on. In simple terms: the low-hanging fruit has been picked, and the tougher challenges lie ahead (see the toy sketch below).
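
A toy illustration (mine, not from the post) of why nonlinear effects make problems wicked: in the logistic map, a standard example of a chaotic system, two nearly identical starting points become completely uncorrelated within a few dozen steps.

```python
# Toy illustration (not from the original post): the logistic map, a
# standard example of how nonlinear dynamics amplify tiny differences,
# which is part of what makes complex systems hard to predict and engineer.
def logistic(x: float, r: float = 3.9) -> float:
    return r * x * (1 - x)


a, b = 0.5, 0.5000001  # two trajectories with nearly identical starts
for _ in range(40):
    a, b = logistic(a), logistic(b)

print(f"after 40 steps: {a:.4f} vs {b:.4f}")  # trajectories have diverged
```
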

How did culture get affected? - Maslow's Hierarchy-style explanations are decently plausible... it's arguably the case that enough people in the US had climbed high enough on Maslow's Hierarchy by, say, 1970 that other considerations became focal

War - war has accelerated innovation many times.

Role of govt - I also think that certain kinds of R&D are public goods and that they'll very likely be underprovided without deliberate mechanisms to address that (such as public funding of R&D), and I'm conceptually strongly supportive of those mechanisms. While I consider myself strongly pro-free market and pro-freedom, I am not a libertarian, and I think this is the kind of place where a traditionally libertarian approach simply doesn’t have much that’s useful to say. Supports govt as a buyer.

What does imputed average elasticities mean?
I think that the industrial policy folks too often talk about "the returns to publicly-funded R&D" as a monolithic whole. I would ask them: exactly how will you choose who gets funded? What will the relevant incentive structure for those people be? What's your theory for how they'll do top-tier science/research/innovation/etc.? These are tough questions! Building systems that allocate capital well at scale and through time is hard. If they have good answers, I'm probably supportive... more experimentation would be great. If not, I'm less hopeful.

Tech actually reduces inequality - https://www.journals.uchicago.edu/doi/abs/10.1086/342055?journalCode=jole
