Brianna White

Administrator
Staff member
The use of computer algorithms to differentiate patterns from noise in data is now commonplace, thanks to advances in artificial intelligence (AI) research, open-source software such as scikit-learn, and the large numbers of talented data scientists streaming into the field. There is no question that competency in computer science, statistics, and information technology can lead to a successful AI project with useful outcomes. However, this recipe for success is missing a piece, one with important implications in some domains: it is not enough to teach humans to think like AI. We need to teach AI to understand the value of humans.
Consider a recent peer-reviewed study from Google and several academic partners that used deep learning neural networks to predict health outcomes from the electronic health records (EHR) of tens of thousands of patients. Google developed special data structures for processing the data, had access to powerful high-performance computing, and deployed state-of-the-art AI algorithms to predict outcomes such as whether a patient would be readmitted to the hospital after a procedure like surgery. This was a data science tour de force.
Although Google’s top-level results in this study claimed to beat a standard logistic regression model, there was a meaningful distinction buried in the fine print. While Google beat a standard logistic regression model based on 28 variables, its own deep learning approach only tied a more detailed logistic regression model built from the same data set the AI had used. Deep learning, in other words, was not necessary for the performance improvement Google claimed. In this example, the AI did not meet expectations.
Continue reading: https://www.extremetech.com/extreme/333588-the-human-side-of-artificial-intelligence
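For anyone who wants to try the kind of comparison the article describes, here is a minimal scikit-learn sketch. It uses synthetic data, not the study's EHR records, and the feature counts and model settings are illustrative assumptions, not the study's configuration: it pits a 28-variable logistic regression, a logistic regression on all available features, and a small neural network against one another on the same task.

```python
# Minimal sketch (not the study's code or data): comparing a small
# logistic regression baseline, a richer logistic regression, and a
# simple neural network on the same synthetic classification task.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Hypothetical stand-in for an EHR-style prediction task
# (e.g., readmission yes/no) with 100 candidate features.
X, y = make_classification(n_samples=5000, n_features=100,
                           n_informative=40, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

models = {
    # "Standard" baseline restricted to the first 28 features,
    # echoing the 28-variable model mentioned in the article.
    "logistic (28 vars)": (LogisticRegression(max_iter=1000), slice(0, 28)),
    # Richer logistic regression built from all available features.
    "logistic (all vars)": (LogisticRegression(max_iter=1000), slice(None)),
    # Simple deep-learning-style model fit to the same features.
    "neural net (all vars)": (MLPClassifier(hidden_layer_sizes=(64, 32),
                                            max_iter=500, random_state=0),
                              slice(None)),
}

for name, (model, cols) in models.items():
    model.fit(X_train[:, cols], y_train)
    auc = roc_auc_score(y_test, model.predict_proba(X_test[:, cols])[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```

On many tabular problems, a well-specified logistic regression on the full feature set lands close to the neural network, which is the point the article is making about the fine print of the Google result.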
 
