Institution

Carnegie Mellon University

Education · Pittsburgh, Pennsylvania, United States
About: Carnegie Mellon University is an education organization based in Pittsburgh, Pennsylvania, United States. It is known for its research contributions in the topics of Computer science and Robotics. The organization has 36,317 authors who have published 104,359 publications receiving 5,975,734 citations. The organization is also known as CMU and Carnegie Mellon.


Papers
Journal Article · DOI
TL;DR: In this paper, the problem of estimating the graph associated with a binary Ising Markov random field is considered, where the neighborhood of any given node is estimated by performing logistic regression subject to an l1-constraint.
Abstract: We consider the problem of estimating the graph associated with a binary Ising Markov random field. We describe a method based on l1-regularized logistic regression, in which the neighborhood of any given node is estimated by performing logistic regression subject to an l1-constraint. The method is analyzed under high-dimensional scaling in which both the number of nodes p and maximum neighborhood size d are allowed to grow as a function of the number of observations n. Our main results provide sufficient conditions on the triple (n, p, d) and the model parameters for the method to succeed in consistently estimating the neighborhood of every node in the graph simultaneously. With coherence conditions imposed on the population Fisher information matrix, we prove that consistent neighborhood selection can be obtained for sample sizes n = Ω(d³ log p) with exponentially decaying error. When these same conditions are imposed directly on the sample matrices, we show that a reduced sample size of n = Ω(d² log p) suffices for the method to estimate neighborhoods consistently. Although this paper focuses on the binary graphical models, we indicate how a generalization of the method of the paper would apply to general discrete Markov random fields.
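The per-node procedure described in the abstract is straightforward to prototype: regress each node's binary values on all remaining nodes with an l1-penalized logistic regression and read the estimated neighborhood off the nonzero coefficients. The sketch below is a minimal, hypothetical illustration using scikit-learn; the random data, penalty strength C, and threshold are placeholder choices, not the paper's experimental setup.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy data: n samples of p binary spins in {-1, +1} (placeholder, not the paper's data).
rng = np.random.default_rng(0)
n, p = 500, 10
X = rng.choice([-1, 1], size=(n, p))

def estimated_neighborhood(X, node, C=0.1, tol=1e-6):
    """One step of the method: l1-regularized logistic regression of `node`
    on all remaining nodes; the support of the coefficients is the neighborhood."""
    y = X[:, node]
    others = np.delete(np.arange(X.shape[1]), node)
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=C)
    clf.fit(X[:, others], y)
    return others[np.abs(clf.coef_.ravel()) > tol]

# Running the regression for every node and symmetrizing the supports
# (e.g. by union or intersection) yields the estimated edge set of the graph.
for node in range(p):
    print(node, estimated_neighborhood(X, node))
```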

776 citations

Journal Article · DOI
TL;DR: Data on the adaptation of the Monte Carlo method, used in statistical mechanics, to study the dynamics of size segregation of solid particles.
Abstract: When a can containing one large ball and a number of smaller ones is shaken, the large ball rises to the top, even when the larger ball is more dense than the others. Similarly, a mixture of different sized particles will segregate by size when shaken. An adaptation of the Monte Carlo method is used to study this size segregation. The results show the local, geometric mechanism by which the segregation is produced. Segregation by size is to be distinguished from the more obvious sifting process which occurs when tiny grains filter down through the interstices between large particles.
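As a rough illustration of the kind of Monte Carlo simulation involved, the toy sketch below settles one large disk and many small disks in a two-dimensional box using Metropolis moves biased toward lower potential energy; a full shaking simulation in the spirit of the paper would additionally lift the packing at intervals and let it resettle. All sizes, counts, and parameters here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
BOX_W = 10.0                                   # box width (arbitrary units)

# One large disk followed by many small ones (illustrative sizes and counts).
radii = np.array([1.5] + [0.5] * 60)
pos = np.column_stack([rng.uniform(2.0, 8.0, radii.size),
                       rng.uniform(2.0, 30.0, radii.size)])  # loose initial column

def blocked(i, trial):
    """Reject moves that leave the box or overlap another disk."""
    x, y = trial
    r = radii[i]
    if x - r < 0 or x + r > BOX_W or y - r < 0:
        return True
    d = np.hypot(pos[:, 0] - x, pos[:, 1] - y)
    d[i] = np.inf                              # ignore the disk's own position
    return bool(np.any(d < radii + r))

beta = 5.0                                     # strong bias toward falling (gravity)
for _ in range(200_000):
    i = rng.integers(radii.size)
    trial = pos[i] + rng.uniform(-0.3, 0.3, 2)
    if blocked(i, trial):
        continue
    dh = trial[1] - pos[i, 1]                  # change in height ~ potential energy
    if dh <= 0 or rng.random() < np.exp(-beta * dh):
        pos[i] = trial

print("large disk height:     ", round(pos[0, 1], 2))
print("mean small disk height:", round(pos[1:, 1].mean(), 2))
```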

776 citations

Journal Article · DOI
TL;DR: This paper presents a meta-modelling system that automates the labor-intensive, and therefore expensive, process of manually cataloging medical equipment for use in the health care system.
Abstract: John A. Swets, Robyn M. Dawes, and John Monahan BBN Technologies (emeritus), Cambridge, Massachusetts; Radiology Department, Brigham and Women’s Hospital, and Department of Health Care Policy, Harvard Medical School, Boston, Massachusetts, Department of Social and Decision Sciences, Carnegie Mellon University, Pittsburgh, Pennsylvania, and School of Law, University of Virginia, Charlottesville, Virginia

774 citations

Proceedings Article
08 Jul 2010
TL;DR: The expressiveness of the GraphLab framework is demonstrated by designing and implementing parallel versions of belief propagation, Gibbs sampling, Co-EM, Lasso and Compressed Sensing, and it is shown that GraphLab achieves excellent parallel performance on large-scale real-world problems.
Abstract: Designing and implementing efficient, provably correct parallel machine learning (ML) algorithms is challenging. Existing high-level parallel abstractions like MapReduce are insufficiently expressive while low-level tools like MPI and Pthreads leave ML experts repeatedly solving the same design challenges. By targeting common patterns in ML, we developed GraphLab, which improves upon abstractions like MapReduce by compactly expressing asynchronous iterative algorithms with sparse computational dependencies while ensuring data consistency and achieving a high degree of parallel performance. We demonstrate the expressiveness of the GraphLab framework by designing and implementing parallel versions of belief propagation, Gibbs sampling, Co-EM, Lasso and Compressed Sensing. We show that using GraphLab we can achieve excellent parallel performance on large scale real-world problems.
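The core pattern GraphLab targets is asynchronous vertex updates over sparse dependencies, where a vertex whose value changes reschedules its neighbours. The single-threaded toy below uses a PageRank-style update purely to illustrate that update-and-schedule pattern; it is not GraphLab's actual API, and the graph and tolerance are made up.

```python
from collections import deque

# Toy directed graph as an adjacency map: vertex -> list of out-neighbours.
graph = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
in_nbrs = {v: [u for u in graph if v in graph[u]] for v in graph}

rank = {v: 1.0 / len(graph) for v in graph}
damping, tol = 0.85, 1e-9

# Scheduler: vertices whose inputs changed and therefore need recomputation.
pending = deque(graph)
scheduled = set(graph)

while pending:
    v = pending.popleft()
    scheduled.discard(v)
    # "Update function": recompute v's value from its in-neighbours only.
    new = (1 - damping) / len(graph) + damping * sum(
        rank[u] / len(graph[u]) for u in in_nbrs[v])
    if abs(new - rank[v]) > tol:
        rank[v] = new
        # The change can affect v's out-neighbours, so reschedule them.
        for w in graph[v]:
            if w not in scheduled:
                scheduled.add(w)
                pending.append(w)

print(rank)
```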

772 citations

Journal Article · DOI
TL;DR: An adaptive statistical language model is described, which successfully integrates long distance linguistic information with other knowledge sources, and shows the feasibility of incorporating many diverse knowledge sources in a single, unified statistical framework.
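As a sketch of what combining long-distance information with other knowledge sources can look like, the snippet below linearly interpolates a static bigram estimate with a document-level cache, a crude long-distance component. This is only an illustration of mixing knowledge sources; the paper's unified statistical framework is more sophisticated than a simple interpolation, and the corpus, history, and weights here are invented.

```python
from collections import Counter

# Tiny illustrative corpus and document history (invented data).
corpus = "the cat sat on the mat the cat ate".split()
history = "the cat sat on the".split()     # words seen so far in the document

unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))
vocab = sorted(unigrams)

def p_bigram(word, prev, alpha=0.1):
    """Static bigram estimate with add-alpha smoothing (short-range source)."""
    return (bigrams[(prev, word)] + alpha) / (unigrams[prev] + alpha * len(vocab))

def p_cache(word, history, alpha=0.1):
    """Cache estimate: words seen recently in the document become more likely."""
    cache = Counter(history)
    return (cache[word] + alpha) / (len(history) + alpha * len(vocab))

def p_combined(word, prev, history, lam=0.3):
    """Linear interpolation of the static source with the long-distance source."""
    return (1 - lam) * p_bigram(word, prev) + lam * p_cache(word, history)

for w in vocab:
    print(f"P({w!r} | 'the', history) = {p_combined(w, 'the', history):.3f}")
```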

771 citations


Authors

Showing all 36645 results

Name                        H-index   Papers   Citations
Yi Chen                     217       4,342    293,080
Rakesh K. Jain              200       1,467    177,727
Robert C. Nichol            187       851      162,994
Michael I. Jordan           176       1,016    216,204
Jasvinder A. Singh          176       2,382    223,370
J. N. Butler                172       2,525    175,561
P. Chang                    170       2,154    151,783
Krzysztof Matyjaszewski     169       1,431    128,585
Yang Yang                   164       2,704    144,071
Geoffrey E. Hinton          157       414      409,047
Herbert A. Simon            157       745      194,597
Yongsun Kim                 156       2,588    145,619
Terrence J. Sejnowski       155       845      117,382
John B. Goodenough          151       1,064    113,741
Scott Shenker               150       454      118,017
Network Information
Related Institutions (5)
Massachusetts Institute of Technology
268K papers, 18.2M citations

95% related

University of Maryland, College Park
155.9K papers, 7.2M citations

93% related

University of Illinois at Urbana–Champaign
225.1K papers, 10.1M citations

93% related

IBM
253.9K papers, 7.4M citations

93% related

Princeton University
146.7K papers, 9.1M citations

92% related

Performance Metrics
No. of papers from the Institution in previous years
Year   Papers
2023   120
2022   499
2021   4,981
2020   5,375
2019   5,420
2018   4,972