2021 Virtual EEGLAB Workshop. From June 14 to June 18, the 30th EEGLAB Workshop will be held online in Gather.Town, an online conference platform. The first day will be free for all and will feature lectures and social events. The schedule has been arranged to accommodate American and European time zones; future events will accommodate Asian time zones. The event will feature lectures on how to process data using EEGLAB, including scripting and the use of multiple plug-ins. Parallel sessions will discuss specific topics, in particular wearable EEG, deep learning and EEG, and connectivity analysis and EEG. We look forward to seeing you at this online event. Click here for more information
Here we highlight new EEGLAB plug-ins of possible wide interest to EEGLAB users. Please send descriptions of new plug-ins for consideration. These should have a brief lead introduction, and further text and images to be published on a continuation page.
MATLAB Viewer/Recorder. The MATLAB Viewer/Recorder is an EEGLAB plug-in and standalone compiled MATLAB tool to record EEG data in an EEGLAB session. The plug-in was built by Christian Kothe and Arnaud Delorme and uses the popular Lab Streaming Layer (LSL) protocol. The plug-in saves the recorded EEG to your computer as an EEGLAB dataset. Using it, recording EEG can be as simple as: (1) turn on your EEG headset, (2) connect the plug-in to your headset using the LSL tools provided for it (these must be downloaded separately), (3) connect to the LSL stream, and (4) press the record button. Once recorded, you may then manipulate the resulting EEG dataset like any other EEGLAB dataset. The MATLAB Viewer/Recorder plug-in is available from the EEGLAB plug-in manager (select EEGLAB menu item File, then submenu item Manage EEGLAB extensions) and also directly from its GitHub repository.
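For readers who script LSL acquisition outside MATLAB, the connect-and-record steps (3) and (4) above can be sketched in Python with the pylsl package. This is an illustrative sketch only, not part of the plug-in; the function name and parameters are our own, and running it requires pylsl plus a live LSL EEG outlet.

```python
def record_lsl_eeg(n_samples=256, timeout=5.0):
    """Sketch of steps (3) and (4): resolve an LSL EEG stream and pull samples.

    Hypothetical helper for illustration; assumes the pylsl package and a
    running LSL EEG outlet. Returns a list of (timestamp, sample) pairs.
    """
    # import inside the function so the sketch can be defined
    # even where pylsl is not installed
    from pylsl import StreamInlet, resolve_byprop

    # step (3): find an EEG-typed stream on the network and connect to it
    streams = resolve_byprop('type', 'EEG', timeout=timeout)
    inlet = StreamInlet(streams[0])

    # step (4): record -- pull samples one at a time
    data = []
    for _ in range(n_samples):
        sample, timestamp = inlet.pull_sample(timeout=timeout)
        if sample is None:  # timed out waiting for data
            break
        data.append((timestamp, sample))
    return data
```

In the actual plug-in these steps are handled by the EEGLAB GUI; the sketch only shows what "connect to the LSL stream" and "press the record button" correspond to at the LSL API level.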
This section contains personal profiles of EEGLAB developers and/or users, with a description of how they use EEGLAB in their research.
Fiorenzo Artoni, Ph.D.
Maître Assistant, Functional Brain Mapping Lab,
University of Genève, Genève, Switzerland
It sounds like a Sci-Fi movie: A man loses his hand in an accident. Scientists fit him with a state-of-the-art bionic hand, using artificial intelligence and robotics. They use algorithms to decode the patient's brain signals, which can predict how he plans to move his fingers, and robotics to help him grasp objects. The patient learns to move his new artificial hand with ease, returning to his normal activities instead of facing a once-debilitating condition.
This seemingly fictitious scenario is quickly becoming reality, thanks to researchers like Dr. Fiorenzo Artoni, Biomedical and Automation Engineer at the Functional Brain Mapping Lab in Switzerland. Dr. Artoni recently completed a 2-year study called BIREHAB, aimed at creating the tools needed to build a more 'life-like' prosthetic hand with robust, real-time ICA, to improve the lives of amputees. "The aim of the project," Dr. Artoni explains, "was to develop both software and hardware tools to characterize ‘referred’ sensations that an amputee can actually feel on his or her phantom hand." Dr. Artoni hopes to use his diverse research to help people who suffer from brain damage or have lost a limb -- to help them gain independence and lead a productive, happy life.
EEGLAB has proven to be an essential asset in Dr. Artoni's research. In collaboration with Drs. Makeig and Delorme, he has also helped to develop RELICA, a toolbox now offered within the EEGLAB distribution. "RELICA is a novel method to characterize Independent Components reliability within subjects," Dr. Artoni adds. Read more»
Mobile Brain/Body Imaging: A One-Day Gathering will be held on June 7, 2021, at Gather.Town and on Zoom. The event will include talks and opportunities to connect socially as a community until we can meet for the full, in-person 4th International Conference on Mobile Brain/Body Imaging, which has been postponed to Summer 2022. Registration information will be announced soon (via the eeglablist and the SCCN website). More information»
The 30th EEGLAB Workshop will be held online, June 14-18, 2021, in a virtual conference center on Gather.Town. Registration will be announced soon (via the eeglablist and the EEGLAB home page). More information»
The Second Hands-on LSL Workshop has been postponed to 2022.
The 31st EEGLAB Workshop in Lublin, Poland, is postponed to 2022. The dates have not been set yet. For more information, contact Dariusz Zapała (email@example.com). More information»
(… the EEGLABLIST email list) This section contains brief questions and answers from the eeglablist archives or elsewhere.
Q: Cedric Cannard, Ph.D. Candidate, CerCo, Paul Sabatier University, Toulouse, France: I am analyzing 64-channel EEG data on the pre-stimulus period [-1500 0] ms (3 conditions) and will use LIMO OLS on the ERP and ERSP data. I read in your papers that non-causal filters can contaminate the pre-stimulus period with post-stimulus activity. Should I use causal filters?
A: Andreas Widmann, Scientific Lab Manager, University of Leipzig: Causal filters should typically only be used if the application explicitly requires them. For example, when causality matters, as in the detection of onset latencies (though the problem is often overestimated, as it mainly affects ultra-sharp transients typically not observed in EEG/ERP), in the analysis of small fast components preceding large slow components (e.g., if higher high-pass cutoff frequencies are required), or in the analysis of pre-stimulus activity, which is your case. The only difference between a linear causal filter and a linear non-causal filter is the time axis: the output of the non-causal filter equals the delay-corrected output of the causal filter. It is sufficient to change the EEG.times time axis. That is, if your signal of interest is further away from stimulus onset than the group delay, you can simply use a linear non-causal filter.
sig = [ 0 0 0 0 0 1 0 0 0 0 0 ]; % test signal (impulse)
b = [ 1 1 1 1 1 ] / 5; % some crude boxcar filter for demonstration purposes only, linear-phase, length = 5, order = 4, group delay = 2
fsig = filter( b, 1, sig ); % causal filter
plot( -5:5, [ sig; fsig ]', 'o' ) % the filtered impulse in the output does not start before the impulse in the input
fsig = filter( b, 1, [ sig 0 0 ] ); % padded causal filter
fsig = fsig( 3:end ); % delay correction by group delay, this is what makes the filter non-causal and zero-phase
plot( -5:5, [ sig; fsig ]', 'o' ) % the filtered impulse in the output starts before the impulse in the input BUT everything before x = -2 is unaffected
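The same demonstration can be written in Python with NumPy (a translated sketch of the MATLAB example above, not part of the original answer): the boxcar FIR is applied causally by convolution, then the two-sample group delay is dropped to obtain the zero-phase (non-causal) output.

```python
import numpy as np

sig = np.zeros(11)
sig[5] = 1.0                   # test signal: impulse at x = 0 (index 5)
b = np.ones(5) / 5             # crude boxcar FIR, linear phase, order 4, group delay 2

# causal filter: pad the input by the group delay so the output tail is available
fsig = np.convolve(np.append(sig, [0.0, 0.0]), b)[:13]

# delay correction: drop the first 2 samples -> non-causal, zero-phase output
fsig = fsig[2:]

# the corrected output is symmetric around the impulse, but everything
# before x = -2 (two samples before the impulse) remains exactly zero
```

Truncating `np.convolve` to the padded input length reproduces MATLAB's `filter(b, 1, x)`, so the delay-corrected result matches the MATLAB demonstration sample for sample.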
Click here for additional Q&As related to causal filtering »
Advances in EEG Neuroimaging: a 20-minute National Institutes of Health (NIH) presentation given by Scott Makeig to the BRAIN Initiative Workshop: Transformative Non-Invasive Imaging Technologies, March 9-11, 2021