
Press Release

Infinia ML Chief Scientist Is Most Prolific Researcher At NIPS 2017 With Ten Accepted Papers

Infinia ML
Posted on: 04 Dec 17
Lawrence Carin, Ph.D., is the most-published author at the Thirty-First Annual Conference on Neural Information Processing Systems (NIPS), with four papers co-authored by Infinia ML founding member Ricardo Henao.

PR Newswire

DURHAM, N.C., Dec. 4, 2017 /PRNewswire/ -- Infinia ML, an advanced machine learning company that delivers transformative automation solutions and predictive analytics to Fortune 500 businesses, announced today that the company's chief scientist, Lawrence Carin, Ph.D., published ten papers at this year's Annual Conference on Neural Information Processing Systems (NIPS). This makes Carin the most prolific researcher at the event, which brings together machine learning experts from institutions like Google, Microsoft, IBM, MIT, Stanford, and Carnegie Mellon. In fact, Carin and his co-authors contributed more papers than AI-focused organizations such as Amazon, Tencent, and OpenAI.

Carin's papers advance the field of machine learning by showing that deep learning has important applications across a wide range of domains, presenting new ideas in text analysis, image synthesis, and the analysis of dynamic local field potentials in the brain.

"This body of work underscores our commitment to leading this field in cutting-edge research," explained CEO Robbie Allen. "Lawrence is one of the most respected ML scientists in the world, and the team he brought to Infinia ML is unparalleled. Their research, covering areas including generative adversarial networks (GANs), deep learning, and neuroscience, is on display at NIPS this year, showing the exceptional capabilities of our growing team," added Allen.

Carin's work helped Duke University contribute a total of 14 papers, a figure comparable to the totals from IBM, Facebook, and Harvard. Infinia ML founding member Ricardo Henao co-authored four of the NIPS papers with Carin.

"With only 21% of papers accepted this year at NIPS, publishing ten is a fantastic achievement and we are very proud to have Dr. Carin and Dr. Henao on our Infinia ML team," said Mike Salvino, Executive Chairman of Infinia ML. "The talent of the team and the published research allows us to quickly deliver business impact to our Fortune 100 clients, specifically helping them decrease costs or increase their revenue."

About Infinia ML
Infinia ML empowers companies to make smarter decisions and automate complex business processes by leveraging the latest breakthroughs in machine learning. Infinia ML has a team of leading AI researchers and deep learning experts who have published hundreds of peer-reviewed papers in top machine learning conferences and journals. Backed by noted private equity firm Carrick Capital Partners, the Durham, North Carolina company is led by CEO Robbie Allen, an experienced AI entrepreneur, and Chief Scientist Lawrence Carin, Ph.D., the Duke University Vice Provost for Research and Professor of Electrical and Computer Engineering. Learn more online at www.InfiniaML.com.

View original content: http://www.prnewswire.com/news-releases/infinia-ml-chief-scientist-is-most-prolific-researcher-at-nips-2017-with-ten-accepted-papers-300565922.html

SOURCE Infinia ML

Last updated on: 04/12/2017
