
Adaptive LSSM algorithm

This repository contains the code for the adaptive linear state-space model fitting algorithm (adaptive LSSM algorithm). This algorithm is used in the paper "Adaptive tracking of human ECoG network dynamics". The mathematical derivation of the algorithm is explained in more detail in the paper "Adaptive latent state modeling of brain network dynamics with real-time learning rate optimization".

Installation guide

Download this repository or clone it using git. You can run the code in MATLAB provided that all of the repository folders are added to the MATLAB path.
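
For example, a minimal way to do this (assuming your MATLAB current directory is the repository root) is:

addpath(genpath(pwd)); % add the repository and all of its subfolders to the MATLAB path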

Dependencies

This repository requires no dependencies beyond built-in MATLAB functions.

User guide

The main functions are AdaptiveLSSMFittingAlgorithm_wholeTrial.m, which fits the model parameters, and prediction_performance.m, which computes the prediction performance using the fitted model parameters. A test script, testScript_adaptiveLSSM.m, is provided to help you get familiar with the algorithm. It simulates a time-varying LSSM and generates a time-series of brain network activity from it. It then adaptively learns the time-varying LSSM parameters for each given forgetting factor (learning rate) and computes the corresponding prediction performance. Finally, the prediction performance is plotted as a function of the forgetting factor. To test the algorithm on your own data, you only need to provide the brain network activity time-series, the forgetting factor, and a few additional setting parameters, roughly as sketched below.
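
The sketch below illustrates this workflow. The function names come from this repository, but the variable names, argument order, and settings shown are illustrative assumptions, not the verified interfaces; see the function headers and testScript_adaptiveLSSM.m for the actual inputs and outputs.

% Placeholder data standing in for your brain network activity time-series
% (here, 10 features over 5000 time samples); replace with your own data.
Y = randn(10, 5000);

lambda = 0.98;              % forgetting factor (learning rate); value chosen for illustration
settings = struct('nx', 5); % hypothetical additional setting parameters (e.g., latent state dimension)

% Adaptively fit the time-varying LSSM parameters over the whole trial
% (argument names and order are assumed, not taken from the code).
fittedParams = AdaptiveLSSMFittingAlgorithm_wholeTrial(Y, lambda, settings);

% Compute prediction performance using the fitted parameters (assumed signature).
perf = prediction_performance(Y, fittedParams);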

License

Copyright (c) 2020 University of Southern California
See full notice in LICENSE.md
Parima Ahmadipour, Yuxiao Yang and Maryam M. Shanechi
Shanechi Lab, University of Southern California
