šŸ‘¾ The Queue #27

Do neural net androids dream of electric analogies?

Taylor Clauson
Dec 5, 2020

Short reads āš”ļø

Snowflake is crushing it, and public investors are clamoring to buy more after its recent quarterly earnings. At a surface level, it’s easy to look at Snowflake and say, ā€œoh, it’s just a new data warehouse.ā€ That isn’t wrong, but it misses what’s special about the company. Michael Malis, one of the founders at Freshpaint (an Abstraction portfolio company), does a great job of breaking down why it’s exciting. Freshpaint Blog


There was recently an intriguing back-and-forth between Matt Biilmann (founder of Netlify) and Matt Mullenweg (founder of Automattic, the company behind WordPress) on the merits and tradeoffs of WordPress (a monolithic approach to building a content-based website or app) versus the Jamstack (a decoupled, composable approach to the same problem). It’s worth reading this Netlify post to grok the ideas behind the two philosophies. There are cases where each makes sense, but I would prefer to bet on the flexible, composable approach offered by the Jamstack. Netlify Blog
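
To make ā€œdecoupled and composableā€ concrete, here is a minimal sketch of a Jamstack-style build step in TypeScript (Node 18+): content is fetched from a headless CMS over HTTP at build time and written out as static HTML. The endpoint URL and response shape below are hypothetical stand-ins, not any particular vendor’s API.

```ts
// Minimal Jamstack-style build sketch: pull content from a (hypothetical)
// headless CMS API at build time and emit static HTML files, instead of
// rendering pages from a tightly coupled database on every request.
import { mkdir, writeFile } from "node:fs/promises";

interface Post {
  slug: string;
  title: string;
  body: string;
}

async function build(): Promise<void> {
  // Hypothetical content endpoint; in practice this is your headless CMS.
  const res = await fetch("https://cms.example.com/api/posts");
  const posts: Post[] = await res.json();

  await mkdir("dist", { recursive: true });
  for (const post of posts) {
    const html = `<!doctype html><html><body><h1>${post.title}</h1><main>${post.body}</main></body></html>`;
    await writeFile(`dist/${post.slug}.html`, html, "utf8");
  }
}

build().catch((err) => {
  console.error(err);
  process.exit(1);
});
```

The static output can then be served from any CDN and recombined with other services, which is the flexibility the Jamstack side of the debate is arguing for.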


Long read šŸ¤–

I, too, am tired of reading about GPT-3. That said, this excellent post by Melanie Mitchell puts the model through its paces on letter-string analogy problems from Douglas Hofstadter’s Copycat domain to see whether it can make coherent analogies. The results are really fascinating… tl;dr in the quote below (emphasis mine):

The program’s performance was mixed. GPT-3 was not designed to make analogies per se, and it is surprising that it is able to do reasonably well on some of these problems, although in many cases it is not able to generalize well. Moreover, when it does succeed, it does so only after being shown some number of ā€œtraining examplesā€. To my mind, this defeats the purpose of analogy-making, which is perhaps the only ā€œzero-shot learningā€ mechanism in human cognition — that is, you adapt the knowledge you have about one situation to a new situation. You (a human, I assume) do not learn to make analogies by studying examples of analogies; you just make them. All the time.
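
To make the ā€œtraining examplesā€ point concrete: Copycat’s domain is letter-string analogies, and few-shot prompting means showing the model solved problems before posing a new one. The sketch below (TypeScript, with made-up letter strings rather than Mitchell’s exact prompts) just assembles such a prompt as text.

```ts
// Illustrative few-shot prompt for a Copycat-style letter-string analogy.
// The example strings are invented for illustration; they are not the
// exact problems or wording from Mitchell's post.
const solvedExamples: Array<[string, string, string, string]> = [
  ["abc", "abd", "pqr", "pqs"], // rule: replace the last letter with its successor
  ["abc", "abd", "ijk", "ijl"],
];

const newProblem: [string, string, string] = ["abc", "abd", "xyz"];

const prompt =
  solvedExamples
    .map(([a, b, c, d]) => `If ${a} changes to ${b}, then ${c} changes to ${d}.`)
    .join("\n") +
  `\nIf ${newProblem[0]} changes to ${newProblem[1]}, then ${newProblem[2]} changes to`;

console.log(prompt);
// The model is asked to complete the final line. "Zero-shot" would mean
// sending only that last sentence, with no solved examples at all.
```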

The potential magic of GPT-3 (or any new tech) isn’t that it’s excellent at everything. It is that ā€œit’s able to do reasonably wellā€ at many, many things. Lowering the barriers to accomplishing work and improving the initial quality are two constant drumbeats in technological innovation.

Zooming out and treating GPT-3 as a placeholder for any new technology, it’s a case study in a potentially disruptive step-function improvement. After all, these often look like toys in the beginning.

Graphic I love šŸŽØ

Randall Munroe of XKCD may be the most underrated genius of this generation. His comics and books are delightfully on-point. As much as I believe in the power of open-source and think that the fragmentation in modern software architecture is not inherently bad, it definitely comes with tradeoffs.

As usual, Munroe’s comics have a kernel of truth in them: a few years ago, a tiny JavaScript package called left-pad (all of 11 lines of code) was removed from npm (the go-to JavaScript package manager and registry), and builds broke across a nontrivial portion of the web.
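
For context, left-pad’s entire job was to pad the front of a string to a given length. Here is a TypeScript approximation (not the original’s exact source, which was plain JavaScript); the point is how small the dependency was relative to how much was built on top of it.

```ts
// TypeScript approximation of what left-pad did: pad the start of a
// string with a fill character until it reaches the requested length.
function leftPad(value: string | number, length: number, fill: string = " "): string {
  let s = String(value);
  while (s.length < length) {
    s = fill + s;
  }
  return s;
}

console.log(leftPad(42, 5, "0")); // "00042"
console.log(leftPad("cat", 5));   // "  cat"
```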

Wikipedia rabbit hole šŸ“–

Spherical Cows. This is such a beautiful, hilarious, and widely applicable metaphor. I’m fond of trying to reduce ideas to their component assumptions, because then it’s easier to sort those assumptions into categories like ā€œI believe that,ā€ ā€œI’m not sure but could be convinced of that,ā€ and ā€œnope, no way.ā€ If you find a spherical cow embedded in one of those assumptions, it’s time to revisit.

Parting thought šŸ¤”

Thinking back to your Econ 101 class: ceteris paribus is typically translated as ā€œall other things being equal,ā€ but maybe ā€œassuming a spherical cow in a vacuumā€ is a plausible alternative?
