Explain It Like I'm Five: AI parameters

What are AI parameters?

Parameters are the connections between pieces of data in an AI model, along with how strongly each connection counts. Developers refine parameters to guide an AI’s behaviour: giving connections more or less weight tells the AI that certain data and actions are more important than others.
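
Here is a toy sketch of that idea in Python. No real model is this simple, and every number and label below is invented, but it shows what “weight” means in code: each parameter is just a number that scales how much one input counts toward the output.

    # Toy sketch: a "parameter" here is just a number that scales how much
    # one piece of input matters. All values below are invented.
    inputs = {"work history": 0.9, "cake recipes": 0.8}   # signal strengths
    weights = {"work history": 2.0, "cake recipes": 0.0}  # the parameters

    # The output is a weighted sum, so heavily weighted inputs dominate it.
    output = sum(inputs[name] * weights[name] for name in inputs)
    print(output)  # 1.8, and the cake recipes contribute nothing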

What do you mean by “weight”?

Fun fact: This is a digital version of how the connections between brain cells work. When your brain gets an input, the signal needs a certain “weight,” or intensity, to actually fire the related cells. It’s why your arm only moves when you want it to, instead of shooting into the air every time you hear the word “arm.” Weighting parameters properly keeps an AI from, say, putting a cake recipe in the resume you told it to write.
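
To make the brain analogy concrete, here is a tiny hypothetical “neuron” in Python. Every number is made up for illustration: the cell only fires when the weighted signal crosses a threshold, which is why overhearing the word “arm” doesn’t move your arm.

    # The cell only "fires" when the signal, scaled by its weight, is
    # strong enough to cross the threshold. All values are invented.
    def neuron_fires(signal, weight, threshold=1.0):
        return signal * weight >= threshold

    print(neuron_fires(signal=0.2, weight=1.0))  # False: overhearing "arm"
    print(neuron_fires(signal=0.9, weight=2.0))  # True: deciding to move it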

Do more parameters mean a better AI?

In theory, more parameters mean an AI can handle more diverse tasks, recognize more complicated patterns, and reason better by making more data connections. But the saying in AI is “garbage in, garbage out.” It doesn’t matter how many parameters an AI has if it’s trained on lousy data, or if those parameters are weighted so the model behaves in a way that’s not useful.

Is this related to tokens?

Parameters are connections between data, but tokens are the units used to measure pieces of data, a rough gauge of how much an AI “knows.” You might hear tokens mentioned alongside context windows, which describe how much information an AI can “think about” at one time.
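
Here is a rough sketch of the difference. Real tokenizers split text into subword pieces rather than whole words, and real context windows hold thousands of tokens; the whitespace tokenizer and the window of 8 below are stand-ins for illustration.

    CONTEXT_WINDOW = 8  # max tokens the model can "think about" at once

    def tokenize(text):
        return text.split()  # stand-in for a real subword tokenizer

    prompt = "Please write me a resume and do not include any cake recipes"
    tokens = tokenize(prompt)
    print(len(tokens))               # 12 tokens in total
    print(tokens[-CONTEXT_WINDOW:])  # only the last 8 fit in the window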

Is it just me, or are companies talking about parameters more?

Announcements over the last week about new AI models from Meta, Microsoft, and Apple focused heavily on parameters. In Microsoft’s and Apple’s cases, that’s because the models are smaller, designed to run directly on devices. The thinking is that if there isn’t space to cram as much info as possible into the AI, they’re better off making one that reasons better.