Quanta Magazine

In the machine learning world, the sizes of artificial neural networks — and their outsize successes — are creating conceptual conundrums. When a network named AlexNet won an annual image recognition competition in 2012, it had about 60 million parameters. These parameters, fine-tuned during training, allowed AlexNet to recognize images that it had never seen before. Two years later, a network named VGG wowed the competition with more than 130 million such parameters. Some artificial…

