Current IN-BIC Fellows
-
Matthew Trappett
PhD Student, Computer Science, University of Oregon
Advisor James Murray, University of Oregon
Collaborating Advisor Boris Gutkin, École Normale Supérieure
My research interest lies in Multi-Task Reinforcement Learning (MTRL), specifically in developing RL algorithms that can infer non-stationarity in the environment and continue to improve performance. My IN-BIC Fellowship work will focus on incorporating homeostatic regulation into MTRL to improve performance on novel tasks.
-
Vivian White
Undergraduate student, Western Washington University
Advisor Kameron Harris, Western Washington University
Collaborating advisors Guillaume Lajoie and Guy Wolf at Mila - Quebec AI Institute
My undergraduate research project focuses on studying the effects of neuron-inspired weights in neural networks, specifically how weight structure influences network performance. I'm also interested in wavelet scattering networks!
-
Domenico Guarino
Postdoctoral Fellow, Neuropsi, Paris-Saclay University
Advisor Alain Destexhe, Neuropsi, CNRS, EITN (European Institute for Theoretical Neuroscience), Paris-Saclay University
Collaborating advisor Stefan Mihalas, Allen Institute
To accelerate the transfer between neurobiology and bio-inspired computing, it's important to understand the link between natural stimuli, cortical structure, and neuronal population events. In this project, we will explore the hypothesis that the hierarchical modular structure of sensory cortices – where smaller network units are contained within larger ones – is isomorphic to the scale-free structure of natural stimuli – where the relationship between features follows a proportional distribution. We will test this hypothesis by applying topological data analysis to two-photon calcium imaging experiments from the Allen Brain Observatory.
Past Fellows
-
Helena Liu
PhD Student, Applied Math, UW
Advisor Eric Shea-Brown, UW
Collaborating Advisor Guillaume Lajoie, Mila and UdeM
My background in math and electrical engineering got me fascinated with understanding how individual components work together to give rise to system-level behaviors. This led to my present research on biological learning algorithms, which enable the brain to learn to perform complicated tasks by leveraging its diverse array of dynamical and signaling elements. Before starting my AccelNet project, my advisor and I collaborated with the Allen Institute and wrote a paper on how cell-type-specific local modulatory networks, revealed by recent genetic evidence from the Allen Institute, can promote efficient learning. To complement this study, I plan to examine the solution structures formed by different biological learning rules for my AccelNet project. When I am not at my desk, I like to go bird watching, learn history, play piano, and go biking.
-
Yuxiu Shao
Postdoctoral Fellow, Group for Neural Theory, École Normale Supérieure
Advisor Srdjan Ostojic, Group for Neural Theory, École Normale Supérieure
Collaborating advisor Eric Shea-Brown, UW
I'm an enthusiast of the "Low-rank" LEGO network, working with Srdjan Ostojic. I have tried putting in "E-I" blocks, "sparse" blocks, and other biological blocks to see how network connectivity affects global dynamics.
In this AccelNet project, I'll incorporate another important "E-I motifs" block proposed by the Shea-Brown lab to enrich the computational power of the low-rank network, and produce a unifying framework for understanding how local E-I connectivity properties determine global dynamics.
-
Shahab Bakhtiari
Postdoctoral researcher, McGill University and Mila
Advisor Blake Richards, McGill University and Mila
Collaborating Advisor Michael Buice, Allen Institute for Brain Science
I am a postdoctoral researcher at McGill University and Mila - Quebec AI Institute. My current research is focused on developing artificial neural network models of the rodent and primate visual systems. I have a PhD in neuroscience from McGill University. I also received my master's and bachelor's degrees in electrical engineering from the University of Tehran.
-
Yohan Chatelain
Postdoctoral fellow, Concordia University
Advisor Tristan Glatard, Concordia University and UNIQUE
Collaborating advisor Ariel Rokem, UW
Yohan Chatelain is a postdoctoral fellow in the Department of Computer Science and Software Engineering at Concordia University. He received a PhD in computer science from the University of Paris-Saclay (2019) at Versailles, France. His main research interest is finite-precision arithmetic, particularly developing tools for detecting and analyzing numerical instabilities in scientific computing programs. He is the main developer of Veritracer and PyTracer, two tools for visualizing the numerical quality of floating-point computations over time. The AccelNet project will focus on understanding the magnitude, origins, and implications of numerical instabilities in human brain tractometry.
-
Mathias Peuvrier
PhD student, Neuropsi, CNRS (Centre National de la Recherche Scientifique), Sorbonne University
Advisor Alain Destexhe, Neuropsi, CNRS, EITN (European Institute for Theoretical Neuroscience), Paris-Saclay University
Collaborating advisor Stefan Mihalas, Associate Investigator, Allen Institute
I am a PhD student working to understand REM sleep, using both experimental and modeling approaches. Analyzing multi-site mouse LFP data, we looked at cortical dynamics in REM and characterized the delta oscillation that can be observed in somatomotor areas while the rest of the cortex displays asynchronous activity. Then, we reproduced these REM features in a whole-brain model made of AdEx mean-field models connected in a large-scale network based on the mouse connectome (from the Allen Brain Atlas). We will now investigate how responsiveness and activity spread across brain states and areas, using an approach based on machine learning and multi-area RNNs developed in S. Mihalas's team, in order to understand the gating of information and its routing to different brain areas as a function of brain state.
-
Pavithra Rajeswaran
PhD Student, University of Washington
Advisor Amy Orsborn, UW
Collaborating advisor Guillaume Lajoie, Université de Montréal and Mila - Quebec AI Institute
I am a third-year PhD student at UW BioE. My PhD research in the Orsborn lab explores the neural signatures of learning and skill consolidation.
Linking neuroscience and artificial intelligence (AI) research could be a great way to dissect learning dynamics and advance BCI research. Computational models like Artificial Neural Networks (ANNs) can be leveraged to understand principles of how networks of neurons learn. ANNs provide an opportunity to dissect learning dynamics in fully observed networks, which cannot be done in neuroscience experiments. With AccelNet funding, we fostered a collaboration with the Lajoie lab (Mila - Québec AI Institute) to investigate the similarities and differences in how the brain and an artificial network learn to solve a BCI task. Our work probes different neural strategies of learning and will enhance our understanding of network-level learning in brain circuits. These findings could also translate to biologically inspired and informed AI algorithms.
-
Pankaj Gupta
PhD Student, University of British Columbia
Advisor Timothy Murphy, University of British Columbia
Collaborating advisor Adrienne Fairhall, UW
Pankaj is a fourth-year PhD student in Neuroscience at the University of British Columbia. He completed his bachelor's and master's degrees in Computer Science before his current program. He is interested in neuroscience research related to neuroplasticity, brain-wide computations, optogenetics, and brain-machine interfaces. In his free time, he likes to cook, go for hikes, or swim. His collaboration project with the AccelNet network is aimed at bio-inspired modelling of multi-regional neural recordings as an artificial neural "network of networks". Such modelling of large-scale neural recordings from multiple brain sites can reveal asymmetric connectivity and task-dependent activations, since we can harness the fully observable weights of the trained network.
-
Jack Price
PhD Student, University of British Columbia
Advisor Kurt Haas, University of British Columbia
Collaborating Advisor Kaspar Podgorski, Allen Institute
I am a graduate student at UBC in Vancouver, Canada. I would describe my area of research as neuroscience from an engineering perspective. I am currently working with a team to build both the software and hardware for a new microscope called SLAP2, which has increased imaging speeds capable of tracking information flow through the intact brain at subcellular resolution. Once the microscope is complete, my group will be able to ask and answer important questions that are foundational to the field of neuronal information processing and encoding.
-
Ryan Vogt
PhD student, University of Washington
Advisor Eli Shlizerman, University of Washington
Collaborating Advisor Guillaume Lajoie, Mila & Université de Montréal
I am a PhD candidate studying the mathematical properties of recurrent neural networks (RNNs) using neuroscience-inspired tools. For my project, I am studying how the dynamics of RNN hidden states relate to the backpropagation of gradients during learning. Furthermore, we will explore means by which this relation can be leveraged to improve learning rules.