
Code Watch: Learning machine learning



Larry O'Brien
July 16, 2012
Machine learning is, regrettably, not one of the day-to-day chores assigned to most programmers. However, with data volumes exploding and high-profile successes such as IBM's Jeopardy-beating Watson and the recommendation engines of Amazon and Netflix, the odds are increasing that ML's opportunity will knock on your door one day.

From the 1960s to the 1980s, the emphasis in artificial intelligence was on "top-down" approaches in which expertise from domain experts was somehow transcribed into a fixed set of rules and their relations. Often these took the form of a series of small "if-then" rules, and the "magic sauce" of expert systems was that they could draw conclusions by automatically chaining together the execution of those rules whose "if" conditions were satisfied. The technology for inferencing worked well enough, but it turned out that very large rulebases were hard to debug and maintain, while smaller rulebases didn't produce many compelling applications (for instance, my expert system for identifying seabirds failed to make me a billionaire).
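To make the chaining idea concrete, here is a minimal forward-chaining sketch in Python; the seabird rules and facts are invented for illustration and aren't drawn from any particular expert-system shell.

    # Minimal forward chaining: a rule fires when all of its "if" facts
    # are known, adding its "then" conclusion to the working memory.
    rules = [
        ({"webbed feet", "long straight bill"}, "likely a cormorant"),
        ({"seen offshore", "likely a cormorant"}, "worth reporting"),
    ]
    facts = {"webbed feet", "long straight bill", "seen offshore"}

    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)   # the conclusion becomes a new known fact
                changed = True

    print(facts)  # the engine has chained both rules together

The interesting (and eventually troublesome) property is that the second rule fires only because the first rule's conclusion became a fact; with thousands of such rules, tracing why a conclusion was or wasn't reached gets hard fast.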

The late 1980s saw a shift toward algorithms influenced by biological processes, and the rebirth of artificial neural networks (which were actually developed in the early 1960s), genetic algorithms, and such things as ant colony and flocking algorithms. There was also a flurry of interest in fuzzy logic, which was particularly well suited to control systems, as it provided continuous response curves.
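To show what that continuous response looks like, here is a minimal sketch of a triangular fuzzy membership function in Python; the "warm" temperature range is invented for the example.

    def triangular(x, left, peak, right):
        # Degree of membership in a fuzzy set, rising from `left` to `peak`
        # and falling back to zero at `right`, giving a continuous curve.
        if x <= left or x >= right:
            return 0.0
        if x <= peak:
            return (x - left) / (peak - left)
        return (right - x) / (right - peak)

    # A crisp threshold flips abruptly at one temperature; the fuzzy set
    # "warm" instead responds gradually between 15 and 30 degrees C.
    for temp in (10, 18, 22.5, 27, 35):
        print(temp, round(triangular(temp, 15, 22.5, 30), 2))

A fuzzy controller blends several such memberships into a smooth output, which is why the approach found a home in thermostats, camera autofocus and the like.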

The 1990s saw increasingly sophisticated algorithms in all these areas and began the march toward today's world of machine learning, with its emphasis on statistical techniques applied to large datasets. Perhaps the most significant development was the invention of support vector machines, which provide a robust way to find hyperplanes that cleanly separate classes in high-dimensional feature spaces.
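As a rough illustration (assuming scikit-learn and NumPy are available, neither of which this column requires), here is a minimal sketch in which a linear SVM is fit to two invented clusters and the learned hyperplane is used to classify new points.

    import numpy as np
    from sklearn.svm import SVC

    # Two invented clusters in two dimensions; real problems are typically
    # much higher-dimensional, which is where SVMs earn their keep.
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 0.5, (20, 2)), rng.normal(3, 0.5, (20, 2))])
    y = np.array([0] * 20 + [1] * 20)

    # A linear SVM finds the separating hyperplane with the widest margin.
    clf = SVC(kernel="linear").fit(X, y)

    print(clf.coef_, clf.intercept_)              # the hyperplane's parameters
    print(clf.predict([[0.2, 0.1], [2.8, 3.1]]))  # classify two new points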

As that last sentence demonstrated, it doesn't take long before the techniques of AI and ML flirt with becoming unintelligible. A sentence can easily become a confusing mashup of mathematics, jargon from AI's five-decade history, and flawed metaphor (artificial neural networks aren't a great deal like real-world neurons, and genetic algorithms don't have much in common with meiosis and DNA recombination). But while there is a temptation to treat a technique as a black box, I strongly believe that sustained success requires gaining an intuition into how it works underneath. That intuition doesn't need to be an academic-level understanding of the mathematics, but it does need to be good enough that you can make reasonable guesses about what type and volume of data you need, what kind of preprocessing you'll want and why, and what problems are likely to crop up during processing.


