Brianna White

Administrator
Staff member
Jul 30, 2019
When it comes to AI, algorithmic innovations are substantially more important than hardware — at least where the problems involve billions to trillions of data points. That’s the conclusion of a team of scientists at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL), who conducted what they claim is the first study on how fast algorithms are improving across a broad range of examples.
Algorithms tell software how to make sense of text, visual, and audio data so that the software can, in turn, draw inferences from it. For example, OpenAI’s GPT-3 was trained on webpages, ebooks, and other documents to learn how to write papers in a humanlike way. The more efficient the algorithm, the less work the software has to do, and as algorithms improve, less computing power should be needed, at least in theory. But this isn’t settled science. AI research and infrastructure startups like OpenAI and Cerebras are betting that models will have to grow substantially in size to reach higher levels of sophistication.
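The efficiency point can be made concrete with a classic, simplified comparison (not from the study itself): a better algorithm often saves far more work than faster hardware can. The hypothetical sketch below counts comparisons for a linear scan versus binary search over the same sorted data; the function names are illustrative, not anything named in the article.

```python
def linear_search(items, target):
    """O(n): examine items one by one, counting comparisons made."""
    for steps, value in enumerate(items, start=1):
        if value == target:
            return steps
    return len(items)

def binary_search(items, target):
    """O(log n) on sorted input: halve the search space each step."""
    lo, hi, steps = 0, len(items), 0
    while lo < hi:
        steps += 1
        mid = (lo + hi) // 2
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    return steps

data = list(range(1_000_000))
print(linear_search(data, 999_999))  # 1,000,000 comparisons
print(binary_search(data, 999_999))  # 20 comparisons
```

A million-fold-faster processor would only cancel out a constant factor; the switch from a linear to a logarithmic algorithm changes how the cost grows with data size, which is why algorithmic gains dominate at the billion-to-trillion-point scales the study discusses.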
The CSAIL team, led by MIT research scientist Neil Thompson, who previously coauthored a paper showing that algorithms were approaching the limits of modern computing hardware, analyzed data from 57 computer science textbooks and more than 1,110 research papers to trace the history of where algorithms improved. In total, they looked at 113 “algorithm families,” or sets of algorithms that solved the same problem, that had been highlighted as most important by the textbooks.
Continue reading: https://venturebeat.com/2021/09/20/improved-algorithms-may-be-more-important-for-ai-performance-than-faster-hardware/
 
