Will AI Replace Radiologists?

Originally posted on Über-Coder:

Ever since Geoffrey Hinton, who is considered the father of deep learning, said at a conference of radiologists that the field will soon be taken over by AI (Artificial Intelligence), a huge debate has erupted among radiologists and AI experts over whether this is a possibility in the foreseeable future.
If we look closely, the answer is clearly no: radiologists will not be replaced by AI systems. However, the nature of their job will change.
The reason there is confusion in the first place is that radiology is, for the most part, a branch of medicine, and medicine is largely misunderstood outside the circle of its practitioners. Furthermore, very few people in the world have a good knowledge of both AI and medicine, together with the statistical inference associated with them.
In classical texts, medical diagnosis is always described as a protocol in which a set of procedures is followed…


Sample Size Estimation for Machine Learning Models Using Hoeffding’s Inequality

Originally posted on Über-Coder:
Wassily Hoeffding (1914–1991) was one of the founding fathers of non-parametric statistics (picture credit: http://stat-or.unc.edu)

Deep learning is the talk of the town these days, and with the advent of frameworks like TensorFlow, Keras, scikit-learn and the like, anyone can implement it with ease. This is why everyone's first hunch when dealing with data is to somehow apply deep learning to it, or at least some form of machine learning. However, what most of us don't realize is that to have a theoretical guarantee over learning, and then testing, such that the error is minimized when the model is deployed in the real world, we need considerably large data sets. And such large data sets are very hard to get.
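To make the idea concrete (this sketch is not from the original post), Hoeffding's inequality for N i.i.d. samples bounded in [0, 1] says that, for a single fixed hypothesis, P(|in-sample error - out-of-sample error| > ε) ≤ 2·exp(-2ε²N); solving for N gives a back-of-the-envelope sample-size estimate. A minimal Python sketch, with ε and δ chosen purely for illustration:

```python
import math

def hoeffding_sample_size(epsilon: float, delta: float) -> int:
    """Smallest N such that 2 * exp(-2 * epsilon**2 * N) <= delta,
    i.e. the in-sample error stays within epsilon of the true error
    with probability at least 1 - delta (single fixed hypothesis)."""
    return math.ceil(math.log(2.0 / delta) / (2.0 * epsilon ** 2))

# Example: to be 95% confident (delta = 0.05) that the measured error
# is within 5 percentage points (epsilon = 0.05) of the true error:
print(hoeffding_sample_size(0.05, 0.05))  # -> 738 samples
```

Note that this is the bound for one fixed hypothesis evaluated on fresh data; for a model selected from a hypothesis set, a union bound over the hypotheses (or a VC-style argument) inflates the required sample size further.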

This theoretical guarantee is of the utmost importance when dealing with medical or health-related data, because to generate confidence intervals (values between which your in-sample point predictors can…
