Brianna White

Administrator
Staff member
Jul 30, 2019
Deep neural networks, a type of artificial intelligence, began outperforming standard algorithms 10 years ago.
Most of artificial intelligence (AI) is a numbers game. Deep neural networks, a type of AI that learns to recognize patterns in data, began outperforming standard algorithms 10 years ago because we finally had enough data and processing power to take full advantage of them.
Today’s neural nets are even more data- and power-hungry. Training them requires tuning the values of millions, if not billions, of parameters that define these networks and represent the strengths of the connections between artificial neurons. The goal is to find near-ideal values for those parameters, a process called optimization, but getting the networks there is difficult.
Getting Hyper
At the moment, the most effective approaches for training and improving neural networks are variations on a process known as stochastic gradient descent (SGD). Training means reducing the network’s errors on a particular task, such as image recognition. An SGD method processes a large amount of labeled data, adjusting the network’s parameters to reduce the errors, or loss. Gradient descent is this repeated process of descending from high loss-function values toward some lower value that represents good-enough (or, in some cases, the best feasible) parameter values.
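The loop described above can be sketched in a few lines. This is a minimal, illustrative example, not any particular library's implementation: it fits a one-parameter-pair linear model to synthetic labeled data with mini-batch SGD, where the loss is mean squared error and the gradients are computed by hand. The variable names and the tiny model are assumptions chosen for clarity; real deep networks apply the same idea to millions of parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "labeled data": inputs X with targets y = 3x + 1 plus noise.
X = rng.uniform(-1, 1, size=(200, 1))
y = 3 * X[:, 0] + 1 + rng.normal(0, 0.05, size=200)

w, b = 0.0, 0.0   # the parameters being tuned
lr = 0.1          # learning rate: how far to step down the gradient

for epoch in range(100):
    # Visit the data in small random batches -- the "stochastic" part.
    order = rng.permutation(len(X))
    for i in range(0, len(X), 20):
        idx = order[i:i + 20]
        pred = w * X[idx, 0] + b
        err = pred - y[idx]                 # per-example error
        # Gradients of mean squared error with respect to w and b.
        grad_w = 2 * np.mean(err * X[idx, 0])
        grad_b = 2 * np.mean(err)
        # Step downhill: this is the "descent" in gradient descent.
        w -= lr * grad_w
        b -= lr * grad_b

print(w, b)  # the parameters approach the true values 3 and 1
```

Each update moves the parameters a small step in the direction that most reduces the loss on the current batch; repeated over many batches, the loss falls toward a minimum, which is the "lower limit" the article refers to.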
Continue reading: https://www.analyticsinsight.net/ai-that-builds-ai-self-creation-technology-is-taking-a-new-shape/
 
