Closures are confusing because they are an “invisible” concept.

When you use an object, a variable, or a function, you do this intentionally. You think: “I’m gonna need a variable here,” and add it to your code.

Closures are different. By the time most people approach closures, they have already used them unknowingly many times, and chances are this is true for you, too. So learning closures is less about understanding a new concept and more about recognizing something you have already been doing for a while.
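If you have ever written an inner function that reads a variable from the function around it, you have already used a closure. Here is a minimal Python sketch (the names are made up for illustration):

```python
def make_greeter(name):
    # `name` is a local variable of make_greeter...
    def greet():
        # ...yet greet() can still read it, even after make_greeter
        # has returned. That link between the inner function and its
        # surrounding scope is the closure.
        print(f"Hello, {name}!")
    return greet

greet_alice = make_greeter("Alice")
greet_alice()  # prints "Hello, Alice!"
```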

GPT-3

GPT-3 is a neural-network-powered language model. A language model predicts how likely a given sequence of words is to occur in real text, so it assigns, say, a higher probability to “I take my dog for a walk” than to “I take my banana for a walk.”
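Concretely, a language model scores a sentence one word at a time and multiplies the conditional probabilities together (the chain rule). The numbers in this Python sketch are invented purely for illustration:

```python
import math

# P(w1..wn) = P(w1) * P(w2|w1) * ... * P(wn|w1..wn-1)
# These conditional probabilities are made up for illustration.
sentence = ["I", "take", "my", "dog", "for", "a", "walk"]
cond_probs = [0.1, 0.05, 0.4, 0.2, 0.3, 0.6, 0.5]

# Sum logs instead of multiplying raw probabilities to avoid underflow.
log_prob = sum(math.log(p) for p in cond_probs)
print(f"P(sentence) = {math.exp(log_prob):.1e}")  # ~3.6e-05
```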

The GPT-3 model architecture itself is a transformer-based neural network. This architecture, introduced in the 2017 paper “Attention Is All You Need,” is the basis for the popular NLP model BERT and GPT-3’s predecessor, GPT-2. From an architecture perspective, GPT-3 is not actually very novel!
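The transformer’s core building block is scaled dot-product attention. A bare-bones NumPy sketch of that single operation (no multi-head split, masking, or learned projections):

```python
import numpy as np

def attention(Q, K, V):
    # softmax(Q K^T / sqrt(d)) V: each output row is a weighted
    # average of the value vectors, weighted by query-key similarity.
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # row-wise softmax
    return weights @ V

x = np.random.randn(4, 8)        # 4 tokens, 8-dim embeddings (sizes arbitrary)
print(attention(x, x, x).shape)  # (4, 8): one new vector per token
```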

IT’S REALLY BIG. I mean really big. With 175 billion parameters, it’s the largest language model ever created.
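For a back-of-envelope sense of that size: merely storing the weights at 2 bytes per parameter (fp16) takes roughly 350 GB, far more than fits on any single GPU:

```python
params = 175e9           # GPT-3's parameter count
bytes_per_param = 2      # fp16; double this for fp32
gigabytes = params * bytes_per_param / 1e9
print(f"~{gigabytes:.0f} GB just to hold the weights")  # ~350 GB
```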

You can ask GPT-3 to be a translator, a programmer, a poet, or a famous author, and it can do so with its user (you) providing fewer than 10 training examples. Damn.
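Those “training examples” are just text in the prompt; no weights are updated. A hypothetical few-shot prompt for English-to-French translation might look like this (the format is illustrative, not an actual API call):

```python
# Hypothetical few-shot prompt: the examples are plain text prepended
# to the query. GPT-3 is simply asked to continue the string.
prompt = """English: Hello
French: Bonjour

English: Thank you
French: Merci

English: Good night
French:"""
# A good completion here would be "Bonne nuit".
```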

Today, GPT-3 is in private beta, but boy, I can’t wait to get my hands on it.