Why Does JavaScript's .sort() Method Compare Numbers Digit by Digit Instead of as Whole Numbers?

When you sort the array [256, 378, 14, 67, 89, 45] using .sort() in JavaScript, the output is [14, 256, 378, 45, 67, 89] instead of [14, 45, 67, 89, 256, 378].

Why is this the case?

This happens because the default behavior of the .sort() method is to convert elements into strings and then compare them based on their UTF-16 code unit values. This behavior is perfectly fine when sorting an array of strings, but it leads to unexpected results when sorting an array of numbers.
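You can see this default behavior directly. Here's a minimal sketch (the output comments show what you'd get in any standards-compliant engine):

const nums = [256, 378, 14, 67, 89, 45];

// With no compare function, each element is converted to a string
// and the strings are compared by their UTF-16 code unit values.
console.log(nums.slice().sort());
// [14, 256, 378, 45, 67, 89]

// This matches the ordering you'd get by sorting the string forms:
console.log(['256', '378', '14', '67', '89', '45'].sort());
// ['14', '256', '378', '45', '67', '89']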

In Unicode, each digit has a specific code point: '0' is 48, '1' is 49, and so on up to '9', which is 57. When comparing '378' and '45' as strings, the .sort() method starts by comparing the first character of each string. '3' (from '378') has a code point of 51, and '4' (from '45') has a code point of 52. Since 51 is less than 52, '378' is considered smaller than '45', so those two elements end up ordered as ['378', '45'].
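To make that comparison concrete, here's a small sketch using charCodeAt to inspect the code units:

console.log('3'.charCodeAt(0)); // 51
console.log('4'.charCodeAt(0)); // 52

// String comparison stops at the first differing character,
// so '378' < '45' is true even though 378 > 45 numerically.
console.log('378' < '45'); // true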

This is why, when sorting numbers, it's recommended to provide a compare function to the .sort() method, telling it to compare the elements as numbers rather than as strings.

const arr = [256, 378, 14, 67, 89, 45];
arr.sort((a, b) => a - b);
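Here, the compare function a - b returns a negative number when a should come before b, zero when they compare equal, and a positive number when a should come after b. Logging the result (illustrative output):

console.log(arr); // [14, 45, 67, 89, 256, 378]

For a descending sort, swap the operands: arr.sort((a, b) => b - a).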

This will correctly sort the array numerically and give you the desired output.
