Deep diffusion-based forecasting of COVID-19 by incorporating network-level mobility information

Bridging the Gap between Spatial and Spectral Domains: A Unified Framework for Graph Neural Networks

Reducing Noise Pixels and Metric Bias in Semantic Inpainting on Segmentation Map

Jianfeng He, Bei Xiao, Xuchao Zhang, Shuo Lei, Shuhui Wang, Chang-Tien Lu: Reducing Noise Pixels and Metric Bias in Semantic Inpainting on Segmentation Map. ICCVW 2021: 1876-1885

DIGDUG: Scalable Separable Dense Graph Pruning and Join Operations in MapReduce

Towards Semantically-Rich Spatial Network Representation Learning via Automated Feature Topic Pairing

Traces of Time through Space: Advantages of Creating Complex Canvases in Collaborative Meetings

You Lu

You Lu was a Ph.D. student in the Department of Computer Science. He was co-advised by Bert Huang and Naren Ramakrishnan. His research areas include structured prediction, probabilistic graphical models, variational inference, and deep generative models.

Congratulations to Sanghani Center 2021 Summer and Fall Graduates

Virginia Tech’s Fall Commencement ceremony for the Graduate School is now underway (livestream here), and seven students from the Sanghani Center are among those receiving degrees.  “This has been a tough year and they successfully navigated obstacles caused by the COVID-19 pandemic to achieve their academic goals, and we are very proud of them,” said Naren Ramakrishnan, the Thomas L. Phillips […]

Eman Abdelrahman

Eman Abdelrahman is a Ph.D. student in the Department of Computer Science. Her co-advisors are Edward Fox and Ismini Lourentzou. Abdelrahman’s research interest lies in applying machine learning and natural language processing to Arabic scientific datasets, such as ETDs, in order to improve access to Arabic scientific data.

Xuan Li

Xuan Li was a master’s degree student in the Bradley Department of Electrical and Computer Engineering. His advisor was Lynn Abbott. Li’s research focuses on continual learning, which prevents a deep neural model from catastrophic forgetting across sequential tasks.