Graphic is from the paper “Learning Compact Representations of Neural Networks using DiscriminAtive Masking (DAM)” 

Jie Bu, a Ph.D. student in computer science, has been interested in machine learning since he was an undergraduate in communications engineering at Harbin Institute of Technology, China. There he was introduced to Random Forests (a machine learning model) and genetic algorithms, which, Bu said, still hold great fascination for him.

In his current research at the Sanghani Center, Bu uses machine learning for physical applications. 

“We have been exploring how we can use machine learning to help solve problems in fluid dynamics and quantum physics. Drawing on scientific knowledge developed over the centuries, we are seeking to find out how machine learning models can be made more interpretable and generalizable,” said Bu.

“Sometimes generating simulation data is very slow, so we are looking at the possibility of using machine learning to accelerate the simulation,” he said. “Machine learning is very powerful and can be used to greatly benefit scientific discovery.”

Bu is also interested in improving the efficiency of deep learning in a number of ways, including better model architecture and network pruning. 

A paper with his advisor, Anuj Karpatne, focuses on model architecture. Bu presented their work on “Quadratic Residual Networks: A New Class of Neural Networks for Solving Forward and Inverse Problems in Physics Involving PDEs” last spring in the proceedings of the 2021 SIAM International Conference on Data Mining (SDM). They developed a new class of quadratic residual networks offering better accuracy, parameter efficiency, and convergence speed for solving forward and inverse problems in physics involving partial differential equations (PDEs).

At the Sanghani Center, Bu said, he has been able to team up and “meet with a lot of brilliant minds.” At the 2021 Neural Information Processing Systems (NeurIPS) conference in December, he presented “Learning Compact Representations of Neural Networks using DiscriminAtive Masking (DAM),” collaborative work with his advisor and other Ph.D. students at the Sanghani Center that uses network pruning.

Bu was also on the research team for the paper “PhyNet: Physics Guided Neural Networks for Particle Drag Force Prediction in Assembly,” published both in the proceedings of SDM 2020 and in the Big Data Journal.

Projected to graduate in summer 2023, Bu would like to continue working in industry in a role aligned with his research direction.