Abstract Our world is changing. Artificial intelligence is being employed in just about all walks of life – from virtual assistants to self-driving cars. How do we ensure the quality of these applications? Major advances have been made in developing applications that utilize some form of artificial intelligence, but there is not nearly as much consideration …
I was lucky enough last week to attend PHLAI, a Comcast-sponsored conference on machine learning and artificial intelligence. The dreary weather did not dampen our spirits as practitioners and business stakeholders met to discuss one of the most important trends of our lifetime.
I recently attended the O’Reilly AI Conference in New York, where artificial intelligence practitioners showcased the impressive strides they’ve made so far in using AI for real-world applications.
This talk draws an arc from Theory-Driven AI to Data-Driven AI and positions Watson along that trajectory. It proposes that to advance AI to where we all know it must go, we need to discover how to efficiently combine human cognition, massive data, and logical theory formation. We need to bootstrap a fluent collaboration between human and machine that engages logic, language, and learning to enable machines to learn how to learn and ultimately deliver on the promise of AI.
In the last decade, a class of machine learning algorithms popularly known as “deep learning” has produced state-of-the-art results in a wide variety of domains, including image recognition, speech recognition, natural language processing, genome sequencing, and financial data analysis, among others. What is deep learning? Why has it become so popular so quickly? How can one fit deep learning into existing pipelines?