Mixture Models

  • Mixed membership models for time series. E. Fox and M. I. Jordan. arXiv:1309.3533, 2013.

  • Revisiting k-means: New algorithms via Bayesian nonparametrics. B. Kulis and M. I. Jordan. In J. Langford and J. Pineau (Eds.), Proceedings of the 29th International Conference on Machine Learning (ICML), Edinburgh, UK, 2012.

  • Bayesian multi-population haplotype inference via a hierarchical Dirichlet process mixture. E. P. Xing, K.-A. Sohn, M. I. Jordan, and Y. W. Teh. Proceedings of the 23rd International Conference on Machine Learning (ICML), 2006.

  • Hierarchical Dirichlet processes. Y. W. Teh, M. I. Jordan, M. J. Beal and D. M. Blei. Journal of the American Statistical Association, 101, 1566-1581, 2006. [Software].

  • Variational methods for the Dirichlet process. D. M. Blei and M. I. Jordan. Proceedings of the 21st International Conference on Machine Learning (ICML), 2004.

  • Bayesian haplotype inference via the Dirichlet process. E. P. Xing, R. Sharan, and M. I. Jordan. Proceedings of the 21st International Conference on Machine Learning (ICML), 2004.

  • Latent Dirichlet allocation. D. M. Blei, A. Y. Ng, and M. I. Jordan. Journal of Machine Learning Research, 3, 993-1022, 2003.

  • Modeling annotated data. D. M. Blei and M. I. Jordan. 26th International Conference on Research and Development in Information Retrieval (SIGIR), New York: ACM Press, 2003.

  • Learning in modular and hierarchical systems. M. I. Jordan and R. A. Jacobs. In M. Arbib (Ed.), The Handbook of Brain Theory and Neural Networks, 2nd edition. Cambridge, MA: MIT Press, 2002.

  • Learning with mixtures of trees. M. Meila and M. I. Jordan. Journal of Machine Learning Research, 1, 1-48, 2000.

  • Asymptotic convergence rate of the EM algorithm for Gaussian mixtures. J. Ma, L. Xu, and M. I. Jordan. Neural Computation, 12, 2881-2890, 2000.

  • Mixed memory Markov models: Decomposing complex stochastic processes as mixtures of simpler ones. L. K. Saul and M. I. Jordan. Machine Learning, 37, 75-87, 1999.

  • Improving the mean field approximation via the use of mixture distributions. T. S. Jaakkola and M. I. Jordan. In M. I. Jordan (Ed.), Learning in Graphical Models, Cambridge: MIT Press, 1999.

  • Approximating posterior distributions in belief networks using mixtures. C. M. Bishop, N. Lawrence, T. S. Jaakkola, and M. I. Jordan. In Jordan, M. I., Kearns, M. J. and Solla, S. A. (Eds.), Advances in Neural Information Processing Systems (NIPS) 10, Cambridge, MA: MIT Press, 1998.

  • Estimating dependency structure as a hidden variable. M. Meila and M. I. Jordan. In Jordan, M. I., Kearns, M. J. and Solla, S. A. (Eds.), Advances in Neural Information Processing Systems (NIPS) 10, Cambridge, MA: MIT Press, 1998.

  • Mixture representations for inference and learning in Boltzmann machines. N. D. Lawrence, C. M. Bishop and M. I. Jordan. In G. F. Cooper and S. Moral (Eds.), Uncertainty in Artificial Intelligence (UAI), Proceedings of the Fourteenth Conference, San Mateo, CA: Morgan Kaufmann, 1998.

  • Factorial hidden Markov models. Z. Ghahramani and M. I. Jordan. Machine Learning, 29, 245-273, 1997.

  • Mixture models for learning from incomplete data. Z. Ghahramani and M. I. Jordan. In Greiner, R., Petsche, T., and Hanson, S. J. (Eds.), Computational Learning Theory and Natural Learning Systems, Cambridge, MA: MIT Press, 1997.

  • Markov mixtures of experts. M. Meila and M. I. Jordan. In Murray-Smith, R., and Johansen, T. A. (Eds.), Multiple Model Approaches to Modelling and Control, London: Taylor and Francis, 1997.

  • On convergence properties of the EM algorithm for Gaussian mixtures. L. Xu and M. I. Jordan. Neural Computation, 8, 129-151, 1996.

  • Markov mixtures of experts. M. Meila and M. I. Jordan. In D. Touretzky, M. Mozer, and M. Hasselmo (Eds.), Advances in Neural Information Processing Systems (NIPS) 8, MIT Press, 1996.

  • Boltzmann chains and hidden Markov models. L. K. Saul and M. I. Jordan. In G. Tesauro, D. S. Touretzky and T. K. Leen, (Eds.), Advances in Neural Information Processing Systems (NIPS) 7, MIT Press, 1995.

  • Learning in modular and hierarchical systems. M. I. Jordan and R. A. Jacobs. In M. Arbib (Ed.), The Handbook of Brain Theory and Neural Networks, Cambridge, MA: MIT Press, 1995.

  • Convergence results for the EM approach to mixtures of experts architectures. M. I. Jordan and L. Xu. Neural Networks, 8, 1409-1431, 1995.

  • An alternative model for mixtures of experts. L. Xu, M. I. Jordan, and G. E. Hinton. In G. Tesauro, D. S. Touretzky and T. K. Leen, (Eds.), Advances in Neural Information Processing Systems (NIPS) 7, Cambridge, MA: MIT Press, 1995.

  • Hierarchical mixtures of experts and the EM algorithm. M. I. Jordan and R. A. Jacobs. Neural Computation, 6, 181-214, 1994.

  • Supervised learning from incomplete data via the EM approach. Z. Ghahramani and M. I. Jordan. In Cowan, J., Tesauro, G., and Alspector, J., (Eds.), Advances in Neural Information Processing Systems (NIPS) 6, San Mateo, CA: Morgan Kaufmann, 1994.

  • A statistical approach to decision tree modeling. M. I. Jordan. In M. Warmuth (Ed.), Proceedings of the Seventh Annual ACM Conference on Computational Learning Theory, New York: ACM Press, 1994.

  • Learning from incomplete data. Z. Ghahramani and M. I. Jordan. MIT Center for Biological and Computational Learning Technical Report 108, 1994.

  • Theoretical and experimental studies of convergence properties of EM algorithm based on finite Gaussian mixtures. L. Xu and M. I. Jordan. Proceedings of the 1994 International Symposium on Artificial Neural Networks, Tainan, Taiwan, pp. 380-385, 1994.

  • Learning piecewise control strategies in a modular neural network architecture. R. A. Jacobs and M. I. Jordan. IEEE Transactions on Systems, Man, and Cybernetics, 23, 337-345, 1993.

  • Supervised learning and divide-and-conquer: A statistical approach. M. I. Jordan and R. A. Jacobs. In P. E. Utgoff, (Ed.), Machine Learning: Proceedings of the Tenth International Workshop, San Mateo, CA: Morgan Kaufmann, 1993.

  • Hierarchies of adaptive experts. M. I. Jordan and R. A. Jacobs. In J. Moody, S. Hanson, and R. Lippmann (Eds.), Advances in Neural Information Processing Systems (NIPS) 4, San Mateo, CA: Morgan Kaufmann, 1992.

  • A competitive modular connectionist architecture. R. A. Jacobs and M. I. Jordan. In D. Touretzky (Ed.), Advances in Neural Information Processing Systems (NIPS) 3, San Mateo, CA: Morgan Kaufmann, 1991.

  • A modular connectionist architecture for learning piecewise control strategies. R. A. Jacobs and M. I. Jordan. Proceedings of the 1991 American Control Conference, Boston, MA, pp. 343-351, 1991.

  • Adaptive mixtures of local experts. R. A. Jacobs, M. I. Jordan, S. Nowlan, and G. E. Hinton. Neural Computation, 3, 1-12, 1991.