Leonardo Cotta


Computer Science PhD student @Purdue

View My GitHub Profile

About

Hi! I’m a PhD candidate in Computer Science @ Purdue University, where I’m advised by Prof. Bruno Ribeiro. Before that, I was a B.Sc. student (also in CS) @ UFMG, Brazil.

During my time as a B.Sc. student, I worked on distributed algorithms (@ UFMG) and quantum computing theory (@ University of Calgary).

My CV (October 2021) is here.

The best way to read about my research is my Google Scholar profile. I try to keep my repositories here on GitHub as up to date as possible. If you don’t find what you’re looking for, or if you have any other questions about my research, drop me a line!

Research

I’m interested in understanding how machines can learn to reason with combinatorial and invariant data, e.g. graphs, sets and posets. In this context, I often use tools from statistical machine learning, causal inference, Monte Carlo sampling, combinatorics and abstract algebra.

Invariant representations of combinatorial data. Representations that incorporate the inherent invariances of combinatorial data lead to learning algorithms with better generalization. I am interested in understanding how we can design representations of graphs, sets and posets that both incorporate the natural invariances present in the data and are expressive enough to distinguish different combinatorial objects. I’m also interested in better quantifying the effect of a specific invariance on the generalization performance of a learning algorithm.
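
As a toy illustration of this idea (a minimal sketch, not tied to any specific paper of mine, with a hypothetical element embedding), a permutation-invariant representation of a set can be built by embedding each element and pooling with a symmetric operation such as a sum, so that reordering the input leaves the output unchanged:

```python
import numpy as np

def set_representation(elements, embed):
    """Toy permutation-invariant set encoder: embed each element
    independently, then pool with a symmetric operation (a sum)."""
    return np.sum([embed(x) for x in elements], axis=0)

# Hypothetical element embedding, for illustration only.
embed = lambda x: np.array([x, x ** 2])

# Reordering the set does not change its representation.
assert np.allclose(set_representation([1, 2, 3], embed),
                   set_representation([3, 1, 2], embed))
```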

Combinatorial representations of combinatorial data. A combinatorial object can be represented through its underlying combinatorial decompositions. For instance, we can describe a graph with its subgraphs, a set with its subsets or a poset with its chains. I’m interested in understanding how to represent each of these substructures and how they can be combined to represent the entire combinatorial object. We have already shown that for certain graph tasks such decompositions can lead to more powerful and robust learning algorithms. More generally, I want to understand the role of combinatorial decompositions in supervised learning, i.e. by how much generalization is impacted and under what conditions.
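
For concreteness, here is a minimal sketch (purely illustrative, using a small toy graph) of one such decomposition: enumerating the k-node induced subgraphs of a graph.

```python
from itertools import combinations

def induced_subgraphs(nodes, edges, k):
    """Enumerate the k-node induced subgraphs of a simple graph,
    one basic combinatorial decomposition of the whole object."""
    for subset in combinations(nodes, k):
        kept = set(subset)
        yield kept, {(u, v) for (u, v) in edges if u in kept and v in kept}

# Example: every 3-node induced subgraph of a 4-cycle is a 2-edge path.
nodes = [0, 1, 2, 3]
edges = {(0, 1), (1, 2), (2, 3), (0, 3)}
for sub_nodes, sub_edges in induced_subgraphs(nodes, edges, 3):
    print(sorted(sub_nodes), sorted(sub_edges))
```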

Learning to answer causal queries with combinatorial data. In real-world systems we are often presented with combinatorial data, e.g. networks, and can perform experiments to observe new outcomes. How can we learn to answer causal queries from such experiments? What does learning even mean in this context? What are possible causal models for combinatorial and invariant data? What is the role of invariant representations here? These are all new and exciting questions I have been working on over the last year or so.

Applications. Recommender systems, bioinformatics, ecology, computer networks and sports analytics.

News

I will be mentoring a project on poset representation learning at LOGML this year! Apply!

Accepted at NeurIPS 2021:

Reconstruction for Powerful Graph Representations

Leonardo Cotta, Christopher Morris, Bruno Ribeiro

I’ll be doing a (remote) internship this summer (2021) at Intel Labs!

Accepted at NeurIPS 2020:

Unsupervised Joint k-node Graph Representations with Compositional Energy-Based Models

Leonardo Cotta, Carlos H.C. Teixeira, Ananthram Swami, Bruno Ribeiro

Contact

cotta [at] purdue [dot] edu

Follow me on Twitter!

Service

LOG Conference 2022 Organizing Committee

NeurIPS 2022 TPC

AAAI 2021 TPC

SDM 2021 TPC