Philosophy of science, statistics, and machine learning all recommend the selection of simple theories or models on the basis of empirical data, where simplicity has something to do with minimizing independent entities, principles, causes, or equational coefficients. This intuitive preference for simplicity is called Ockham's razor, after the fourteenth-century theologian and logician William of Ockham. But in spite of its intuitive appeal, how could Ockham's razor help one find the true theory when the truth might be complex? This project, involving Kevin Kelly, Conor Mayo-Wilson, and Hanti Lin, concerns the development and extension of a new answer to that question, called the Ockham efficiency theorem. The idea is that it is hopeless to provide an a priori explanation of how simplicity points at the true theory immediately, since the truth may depend upon subtle empirical effects that have not yet been observed. But Ockham's razor nonetheless guarantees a priori to keep one on the straightest possible path to the truth, allowing for unavoidable twists and turns along the way as new effects are discovered. The work is currently supported by NSF grant #0740681.
Algebraic set theory uses the methods of category theory to study elementary set theory. This website collects together current research in algebraic set theory to promote further development of the subject.
AProS (Automated Proof Search) is a theorem prover that aims to find normal natural deduction proofs of theorems in sentential and predicate logic. It does so efficiently for minimal, intuitionistic, and classical versions of first-order logic. The work on AProS is part of the broader educational AProS Project.
Paul Bernays was arguably the greatest philosopher of mathematics in the twentieth century, yet much of his work remains untranslated from the original German or French and is thus almost inaccessible to the broader community. This project is very close to completing work on two volumes of a bilingual edition focused on Bernays's papers in the philosophy of mathematics.
Causal Judgment in Cognitive and Developmental Psychology
David Danks, Clark Glymour, and Richard Scheines have helped to adapt the causal Bayes net framework to model human learning, an enterprise that is now continued by a number of psychologists across the nation. Danks, Glymour, and Scheines previously participated in the James S. McDonnell Foundation Causal Learning Collaborative, which brought together psychologists, philosophers, and computer scientists from multiple institutions to investigate the representations and processes underlying causal learning and judgment in adults and children. That research continues in multiple collaborations that provide opportunities for graduate students and postdoctoral fellows.
Causal Learning from Complex Time Series Data
In collaboration with Sergey Plis (Mind Research Network; University of New Mexico), David Danks is exploring our ability to learn causal structure from complex time series data. In particular, they focus on cases in which either the system is measured too slowly (“undersampled” data), or causally important variables are omitted from the time series. These problems are characteristic of neuroimaging, their focal application domain. This research is currently supported by the NSF.
This program lets students of statistics and causal reasoning learn about creating and manipulating statistical models. Through its intuitive interface, downloadable lessons teach about interpreting data, methods of discovering causal structure, and other statistical methods. The Causality Lab is incorporated into the Online Learning Initiative's Causal and Statistical Reasoning Course.
The Collected Works of Rudolf Carnap will be the first complete edition of Carnap's published philosophical writings. It will be published by the Open Court Publishing Company under the General Editorship of Professor Richard Creath and an editorial board including leading philosophers, logicians and mathematicians.
The Computational Systems Biology Group is an association of statisticians, computer scientists and biologists at Carnegie Mellon University, the University of Pittsburgh and the University of West Florida Institute for Human and Machine Cognition. It investigates statistical, algorithmic, experimental design and biological issues surrounding the interpretation of expression data, especially with SAGE and microarray techniques.
Formal Verification and Automated Reasoning (contact: Jeremy Avigad)
Formal verification refers to the use of logic-based computational methods to verify the correctness of hardware and software, as well as the correctness of mathematical proofs.
- Interactive Theorem Proving. Potential projects include extending the mathematical library of the Lean interactive theorem prover, or developing educational materials based on that library.
- Automated Reasoning. Explore the use of automated theorem provers and constraint solvers in solving mathematical problems.
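To give a flavor of the kind of contribution the first project involves, here is a minimal sketch of a lemma one might add to a Lean library. The statement and proof are illustrative only (Lean 4 syntax assumed), not drawn from the Lean mathematical library itself:

```lean
-- Commutativity of addition on the natural numbers, proved by
-- induction on the second argument. Library lemmas of this kind,
-- with machine-checked proofs, are the building blocks of Lean's
-- mathematical library.
theorem addComm (m n : Nat) : m + n = n + m := by
  induction n with
  | zero => simp
  | succ n ih => rw [Nat.add_succ, ih, Nat.succ_add]
```

Each step of such a proof is checked by Lean's kernel, so accepted lemmas can be reused safely by later developments and by automated tools built on top of the library.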
Full Circle: Publications of the Archive of Scientific Philosophy, Hillman Library, University of Pittsburgh
Under the general editorship of Steve Awodey, Full Circle publishes original, hitherto unpublished documentary material from the Archive of Scientific Philosophy and related collections around the world. It also publishes monographs and collections of essays on subjects related to papers contained in these archives, spanning the entire history of philosophy, science, and scientific philosophy in the nineteenth and twentieth centuries.
Hilbert's reputation as one of the greatest mathematicians is well established, yet many of his deepest ideas are found only in lectures that were never formally published. Over the course of six volumes, this project will present the most important of Hilbert's unpublished writings on the foundations of mathematics and of physics. Three of the six volumes have been published by Springer. Of particular significance for the foundations of mathematics are volume I (on geometry) and volumes II and III (on logic and arithmetic). Volumes I and III have been completed, while volume II is currently in preparation.
Homotopy Type Theory refers to a new interpretation of Martin-Löf’s constructive type theory into abstract homotopy theory, which was pioneered at CMU by Awodey and his students. Logical constructions in type theory correspond to homotopy-invariant constructions on spaces, while theorems and even proofs in the logical system inherit a homotopical meaning. Homotopy type theory is closely tied to the new Univalent Foundations of Mathematics program, initiated by Vladimir Voevodsky, for a comprehensive, computational foundation for mathematics based on the homotopical interpretation of type theory. CMU hosts an active research group in this area.
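The correspondence between logic and homotopy can be sketched concretely: an element of the identity type `a = b` is read as a path from `a` to `b`, transitivity of equality becomes path composition, and substitution becomes transport along a path. The following Lean fragment is an illustrative sketch of that reading (the definition names are ours, not the research group's):

```lean
-- Homotopical reading of the identity type:
-- a proof of `a = b` is a path from a to b.

-- Path composition = transitivity of equality.
def pathComp {A : Type} {a b c : A} (p : a = b) (q : b = c) : a = c :=
  p.trans q

-- Transport: a path `a = b` carries an inhabitant of `P a` to `P b`;
-- logically, this is substitution of equals for equals.
def transport {A : Type} (P : A → Type) {a b : A} (p : a = b) (x : P a) : P b :=
  p ▸ x
```

Under the univalent reading, these constructions are automatically homotopy-invariant, which is what gives the interpretation its mathematical force.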
Argument maps are diagrams that display the structure of an argument. By combining pictures and words, argument maps help people create better arguments and analyses. iLogos allows for the easy construction and sharing of argument maps.
This website teaches computational epistemology intuitively using animated and interactive explanations. Developed by Seth Casana, the first lesson available presents Kevin Kelly's research on Ockham's Razor and how it applies to theory choice.
Through the Open Learning Initiative, Carnegie Mellon offers accessible, high quality online education. It is also an ideal environment to enrich our understanding of effective online teaching techniques. The philosophy department has developed three of the available courses. The first, Logic and Proofs, is an introduction to modern symbolic logic that incorporates the Carnegie Proof Lab. The second, Causal and Statistical Reasoning, is comparable to a full semester course on causal and statistical reasoning taught at Carnegie Mellon University. The third, Argument Diagramming, is a mini-course introduction to reconstructing and visually representing arguments.
Under the direction of Robert Cavalier, the department's Digital Media Lab for Applied Ethics and Political Philosophy is collaborating with Jim Fishkin's Center for Deliberative Democracy (Stanford) on developing online tools for Deliberative Polls. Called Project PICOLA, these tools provide next-generation computer-mediated communication software for online structured dialogue and deliberation.
TETRAD is a freeware program for creating and manipulating causal/statistical models. Its intuitive interface and powerful algorithms allow for data prediction and model discovery without the need for cumbersome programming or extensive statistical sophistication.
Watched: Government Surveillance and Privacy in America
Watched delivers an enjoyable online learning experience designed to educate Americans about contemporary threats to personal privacy. It was produced by the students in the philosophy department's Morality Play course in collaboration with a student team from the Entertainment Technology Center. The course explores the potential of interactive media to facilitate values education, and in the spring of 2014 it resulted in this cutting-edge educational prototype. Watched will provoke you both to think and to care about personal privacy, and open your eyes to the role interactive media can play in educating moral sensibilities.