8 March 2021
Dr. Nikolas Provatas
Professor, Condensed Matter Physics, McGill University
Canada Research Chair in Computational Materials Science
Scientific Director, McGill High Performance Computing Centre (a Compute Canada node)
Member, National Initiatives Committee, Compute Canada
Dr. Provatas works at the interface of condensed matter physics and materials science. He uses high-performance computing, principles of statistical mechanics and experiments to understand microstructure evolution (from the scale of about ten billionths of a metre to a hundredth of a millimetre) during material processing, and how this can be scaled up for industrial applications. Many of these problems come directly from, or are motivated by, companies wanting to know how to optimize materials and their performance for certain applications.
The internal structure of materials influences how well things like aircraft, automobiles, bridges, buildings and other structures perform. For example, an automobile made with lighter alloys like aluminum or magnesium requires less metal, which saves manufacturing costs, and lighter vehicles use less gas, which saves consumers money.
Your background is in both materials physics and engineering. How do you apply that experience to the research you do today?
I’m what you would call a computational materials scientist. I create theories and then test them using computer modeling and simulations to understand how to design a better material or product from the atomic scale up. How matter self-assembles has a direct impact on how a material will function – how strong it will be, how ductile, how it will conduct electricity and how heat will pass through it. This is fundamental physics that will help industry produce smarter materials faster.
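To make the idea of simulating self-assembly concrete, here is a minimal, generic phase-field sketch in Python. It uses the standard textbook Allen-Cahn equation, not Dr. Provatas's actual research codes, and every parameter value is an illustrative assumption:

```python
import numpy as np

# A minimal 1-D phase-field (Allen-Cahn) sketch of microstructure
# formation. This is a generic textbook illustration, NOT the models
# discussed in the interview; all parameter values are assumed.
np.random.seed(0)
N, dx, dt, steps = 200, 1.0, 0.1, 2000
eps2 = 1.0  # gradient-energy coefficient (assumed value)

# Order parameter phi distinguishes two phases (phi near +1 vs -1),
# initialized with small random noise; periodic boundaries via np.roll.
phi = 0.01 * (2.0 * np.random.rand(N) - 1.0)

for _ in range(steps):
    lap = (np.roll(phi, 1) - 2.0 * phi + np.roll(phi, -1)) / dx**2
    # Gradient flow on a double-well free energy:
    #   dphi/dt = eps2 * laplacian(phi) - (phi**3 - phi)
    phi += dt * (eps2 * lap - (phi**3 - phi))

# The initial noise coarsens into domains where phi sits near +1 or -1:
# a crude cartoon of how distinct phases emerge in a microstructure.
```

Research-scale versions of this idea solve coupled, multi-dimensional equations with realistic thermodynamics, which is why the HPC resources discussed below matter.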
Can you provide an example?
An airplane wing has to be very strong, but very lightweight. That means using specially treated aluminum versus very heavy steel. A semiconductor has to have a certain crystal structure to exhibit the electrical conductivity needed to power modern electronics. Defects in the material can compromise that performance. So understanding how the material is processed and structured will influence how it will perform in the real world. This is a very big issue for manufacturers and product developers.
How important is Compute Canada to your research?
Compute Canada is like my virtual lab. It provides most of the cycles and storage for my work and my students’ work. Turning an idea into innovation means being able to scale up problems using parameters and system sizes that are relevant to industry. It’s not possible to do that on a desktop computer.
Is any of your research funded by industry?
About half is presently. For example, I am working with Novelis (which has its global technology centre based in Kingston) to understand microstructure and phase development in aluminum sheet. By linking the microstructure to the cooling and chemistry, Novelis hopes to learn how to optimize their process to produce a better material. I also have a new project with IBM, as part of the MiQro Innovation Collaborative Centre (C2Mi), to understand how circuit boards packaged at their Bromont facility can be designed at the microstructure scale to hold up against stresses during the manufacturing process. Again, the linchpin is being able to run this modeling off the Compute Canada platform to produce results for these folks.
What is the main benefit to using computer modeling?
Computational models are cheaper, less time-consuming and more efficient than trial-and-error experiments in the lab. With trial and error, you never know for sure if you’ve hit the jackpot – if the material you’ve chosen is any better than what you started with. Simulation will make it possible to go into a virtual laboratory, synthesize virtual materials and processes using Compute Canada resources, and then go to the plant floor to manufacture a better semiconductor, advanced steel or aluminum, or even a biomaterial.
What are your responsibilities as the Scientific Site Director at the Compute Canada node at McGill University?
I see my role as directing the scientific mission of the (McGill HPC) centre. Researchers care about the science that needs this computing. They want the computing itself to be easy to access, efficient and reliable for the work they do. We’re the service providers that make that happen. We do the dirty work, whether it’s setting up the software, the right kind of storage, the right compute configuration. We also need to be looking at where big science fields like genomics, materials science, neurology, aerodynamics, etc., are heading and pre-emptively design our machine and teams to serve niche areas that are going to have a major impact on the economy and society.
Brain research is another science that relies on powerful computational resources. How are Compute Canada and the McGill HPC Centre facilitating advances in this important field?
Dr. Allan Evans here at McGill is the only Canadian participant in the international Human Brain Project (a $1.6-billion initiative to translate the complexities of the human brain into a multilayered supercomputer simulation). He also leads the CANARIE-funded CBRAIN project, which is developing a platform for distributed processing and sharing of 3D/4D brain-imaging data across five Canadian research centres. Both projects would use Compute Canada centres to ensure doctors and researchers have easy access to the data and can translate it into practical knowledge about brain development and disease.
This computing infrastructure is located at universities. How do you see industry benefiting?
There are three main benefits. First, collaborating with academia means turning $1 in industry investment into $3 or $4 when matched by one of several generous Canadian federal and provincial matching programs. You’d have to be insane as an industrialist to not take advantage of that. In return, industry directly benefits from the academics’ research, which relies on Compute Canada resources. That could be learning to design an alloy with a better strength-to-weight ratio, a better liquid crystal or a better polymer. Then there are the students who are trained to exploit the critical link between HPC, basic research and real-world applications. About half of my students have gone to industry or industry-related jobs. They are providing the foundation for more tech-savvy industries that make computing part of their existence.
In addition to the computing infrastructure, how do you see Compute Canada helping to advance this sector?
We’re still a long way from being able to virtually engineer materials, from the microstructure up, with certain properties to serve a targeted application. To get there, first we need a culture of education that sees modelling as part of a bigger whole. Compute Canada has a vital role in terms of teaching what HPC is, how to use it and how to integrate it into research and development as an enabling tool. There also has to be national coordination of a platform that ensures researchers can easily access, push and analyze data, and work collaboratively on it without having to know the technical nitty-gritty. That’s where I see Compute Canada’s role becoming even more important in the future.