AI Resources Digest: Issue 71 (20171015)

Author: chen_h
WeChat & QQ: 862251340
WeChat public account: coderpai


1. [Blog] The 10 Algorithms Machine Learning Engineers Need to Know

Introduction:

There is no doubt that the sub-field of machine learning / artificial intelligence has gained increasing popularity over the past couple of years. With Big Data being the hottest trend in the tech industry at the moment, machine learning is incredibly powerful for making predictions or calculated suggestions based on large amounts of data. Some of the most common examples of machine learning are Netflix's algorithms, which make movie suggestions based on movies you have watched in the past, and Amazon's algorithms, which recommend books based on books you have bought before.

Original link: https://gab41.lab41.org/the-10-algorithms-machine-learning-engineers-need-to-know-f4bb63f5b2fa


2. [Blog] A news recommendation engine driven by collaborative reader behavior

Introduction:

Yuan Huang is an Insight alum from the Summer 2017 session of Insight Data Science in New York. Yuan is completing a PhD in Computational Condensed Matter Physics at the University of Massachusetts Amherst. In this article, Yuan describes how she combined collaborative filtering with content-based filtering to develop a recommendation engine for news articles based on user behavior.
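
As a rough, hypothetical illustration of the hybrid idea (blending a collaborative signal with a content-based one), here is a minimal sketch. It is not Yuan's code: the interaction matrix, the article content vectors, and the blending weight are all made up.

```python
# Minimal, made-up sketch of a hybrid recommender: a collaborative score
# (what similar readers read) blended with a content score (how similar an
# article is to the reader's own history). None of this is from the article.
import numpy as np

# Hypothetical user-article interaction matrix (1 = read, 0 = not read).
interactions = np.array([
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [1, 1, 0, 0],
], dtype=float)

# Hypothetical article content vectors (e.g. topic weights from TF-IDF/LDA).
content = np.array([
    [0.9, 0.1],
    [0.1, 0.9],
    [0.8, 0.2],
    [0.2, 0.8],
])

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

def recommend(user_idx, alpha=0.5):
    """Rank articles for one user by mixing collaborative and content scores."""
    user_vec = interactions[user_idx]
    # Collaborative part: weight other users' reading by their similarity to this user.
    user_sims = np.array([cosine(user_vec, other) for other in interactions])
    collab = user_sims @ interactions
    # Content part: compare each article to the user's aggregated topic profile.
    profile = user_vec @ content
    cont = np.array([cosine(profile, article) for article in content])
    scores = alpha * collab + (1 - alpha) * cont
    scores[user_vec > 0] = -np.inf   # don't re-recommend already-read articles
    return np.argsort(-scores)

print(recommend(0))   # article indices, best first
```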

Original link: https://blog.insightdatascience.com/news4u-recommend-stories-based-on-collaborative-reader-behavior-9b049b6724c4


3. [Blog] Bayesian Nonparametrics

Introduction:

Bayesian Nonparametrics is a class of models with a potentially infinite number of parameters. The high flexibility and expressive power of this approach enable better data modelling than parametric methods.

Bayesian Nonparametrics is used in problems where a dimension of interest grows with the data, for example, problems where the number of features is not fixed but is allowed to vary as we observe more data. Another example is clustering, where the number of clusters is automatically inferred from the data.

The Statsbot team asked data scientist Vadim Smolyakov to introduce us to Bayesian nonparametric models. In this article, he describes the Dirichlet process along with associated models and provides links to their implementations.
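
As a toy illustration of how the number of clusters can be inferred rather than fixed up front, here is a minimal sketch of the Chinese restaurant process, the sequential view of the Dirichlet process. This is my own example, not code from the article, and the concentration parameter alpha is arbitrary.

```python
# Minimal sketch of the Chinese restaurant process (CRP): each new point joins
# an existing cluster with probability proportional to its size, or opens a
# new cluster with probability proportional to alpha. Not from the article.
import numpy as np

def chinese_restaurant_process(n_points, alpha, seed=None):
    """Sample a random partition of n_points items."""
    rng = np.random.default_rng(seed)
    assignments = [0]        # the first point starts the first cluster
    cluster_sizes = [1]
    for _ in range(1, n_points):
        n = sum(cluster_sizes)
        # Probabilities: size_k / (n + alpha) for each existing cluster,
        # and alpha / (n + alpha) for a brand-new cluster.
        probs = np.array(cluster_sizes + [alpha], dtype=float) / (n + alpha)
        choice = rng.choice(len(probs), p=probs)
        if choice == len(cluster_sizes):   # open a new cluster
            cluster_sizes.append(1)
        else:
            cluster_sizes[choice] += 1
        assignments.append(choice)
    return assignments, cluster_sizes

_, sizes = chinese_restaurant_process(100, alpha=2.0, seed=0)
print(f"{len(sizes)} clusters for 100 points, sizes: {sizes}")
```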

Original link: https://blog.statsbot.co/bayesian-nonparametrics-9f2ce7074b97


4. [Blog] Experiments with a new kind of convolution

Introduction:

There are many things that I don't like about convolution. The biggest of them all: most of the weights, particularly in later layers, are quite close to zero. This suggests that most of these weights haven't learned anything and don't help the network process any new information.
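
As a quick, hypothetical way to check this kind of observation on a network of your own (this is not the author's experiment), here is a minimal PyTorch sketch that counts near-zero convolution weights per layer; the pretrained model and the threshold are my own arbitrary choices.

```python
# Minimal sketch: measure, layer by layer, the fraction of convolution weights
# whose magnitude falls below a small threshold. Assumes PyTorch + torchvision;
# the model and threshold are arbitrary, illustrative choices.
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights="IMAGENET1K_V1")   # any trained CNN will do
threshold = 1e-2                                   # arbitrary "near zero" cutoff

for name, module in model.named_modules():
    if isinstance(module, nn.Conv2d):
        w = module.weight.detach().abs()
        frac_small = (w < threshold).float().mean().item()
        print(f"{name:25s} near-zero weights: {frac_small:.1%}")
```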

So I wanted to modify the convolution op to solve this problem. This blog post highlights the experiments I did in that direction and the results.

Original link: https://medium.com/towards-data-science/experiments-with-a-new-kind-of-convolution-dfe603262e4c


5. [Blog] Recurrent Neural Networks for Email List Churn Prediction

Introduction:

Not very long after finishing my write-up of the lessons learned from building a Hello World Neural Network, I thought that I could move on from a simple MLP to a more sophisticated neural net. It was probably Karpathy's blog post about the unreasonable effectiveness of Recurrent Neural Networks that made me choose to continue with an RNN.

To be honest, this wasn't my only motive. Those who read my posts will already know that one of the problems I have studied extensively during the past few months is mailing list churn prediction using data from MailChimp. It was a series of posts in which I covered:

  1. How to Predict Churn: When Do Email Recipients Unsubscribe?
  2. How to Predict Churn: A model can get you as far as your data goes
  3. Predicting Email Churn with NBD/Pareto
  4. Recurrent Neural Networks for Email List Churn Prediction (This post)
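
For readers who want a concrete starting point, here is a minimal, hypothetical sketch of what an RNN churn model could look like. It is not the model from the post; the per-campaign features, the shapes, and the PyTorch/LSTM choice are all my own assumptions.

```python
# Minimal, hypothetical sketch of an RNN churn model: it reads a sequence of
# per-campaign engagement features for one subscriber (e.g. opened, clicked,
# hours-to-open) and outputs an unsubscribe probability. Feature names,
# shapes, and the LSTM architecture are assumptions, not the post's model.
import torch
import torch.nn as nn

class ChurnRNN(nn.Module):
    def __init__(self, n_features=3, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                 # x: (batch, n_campaigns, n_features)
        _, (h_last, _) = self.lstm(x)     # h_last: (1, batch, hidden)
        return torch.sigmoid(self.head(h_last[-1]))   # churn probability per subscriber

model = ChurnRNN()
fake_batch = torch.rand(8, 10, 3)   # 8 subscribers x 10 campaigns x 3 features
print(model(fake_batch).shape)      # torch.Size([8, 1])
```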

Original link: https://www.blendo.co/blog/recurrent-neural-networks-email-churn-prediction/