Adding EOG to our Workflow

Our lab at MSU started from a few boxes of equipment in an empty room, and has since grown into a busy, cooperative lab running two EEG systems.

Amidst our growth, we have learned several new techniques for gathering EEG (electroencephalogram) data, processing it, and interpreting it. Recently, we have incorporated EOG (electrooculogram) into our lab.

When you build a lab from scratch in a relatively new field, with no experts in your department, nothing is handed to you, so we have had to push ourselves to bring everything up to field standards. We have come a long way, but until recently EOG was still not part of our workflow.

Googling something like “how to use EOG” or “what to do with EOG” doesn’t help much, especially when everyone uses different software or hardware, or works in a different field entirely, which changes their whole methodology and how they use the technology.

So, though Google and Wikipedia can tell you what something is, the nitty-gritty details of using something like EOG (checking compatibility with your other hardware, interpreting the data, and so on) are left for you as a lab researcher to figure out. Sure, you can seek help from more knowledgeable experts, but at the end of the day, making it work for you is really up to you.

Some background: we record with ANT Neuro hardware and pipe the data into MATLAB toolboxes called EEGLAB and ERPLAB, which process the raw recordings into numbers that we then analyze in R. In neurolinguistics, we use EEG to look at event-related potentials (ERPs), which index changes in neurophysiological activity in the brain with respect to some event. For example, say you see a series of Xs, like this–

X X X X X X X X

–displayed one at a time on a computer screen in front of you. You may become rather accustomed to seeing Xs, so something else may throw you off your game:

X X X X O X X X

Once this pattern is broken, your brain has to register that the stream of Xs you were so used to seeing has indeed changed, which I guess can be upsetting for both you and the brain. This is a classic design known as the oddball paradigm, and it’s useful here to illustrate that we not only recognize this change behaviorally, but that the difference is also easy to pick out at the neurophysiological level.

At the O, we would mark about a second before and after to set a window of time in the EEG data to focus on, and we would take every window in which the O was presented to the participant and average them together. By doing this, we get a picture of what happens in the brain when this event occurs. What we would find is a change in electrical activity relative to a baseline condition, in this case, seeing Xs. These changes in electrical activity are evidence that the brain is doing some extra processing work. At the end of the day, we are interested in seeing where in the flow of processing information the brain is spending that extra effort.
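(To make that averaging step a little more concrete, here is a rough MATLAB sketch of the idea. This is not our actual EEGLAB/ERPLAB pipeline; the variable names eeg, fs, and event_samples are placeholders I made up for illustration.)

% Conceptual sketch: cut a window around each event and average the
% windows together to get an ERP. Assumes 'eeg' is a channels-by-samples
% matrix, 'fs' is the sampling rate in Hz, and 'event_samples' holds the
% sample indices where the oddball ("O") appeared -- all hypothetical names.

pre  = round(1.0 * fs);   % about one second before the event
post = round(1.0 * fs);   % about one second after the event

n_chan   = size(eeg, 1);
n_events = numel(event_samples);
epochs   = zeros(n_chan, pre + post + 1, n_events);

for k = 1:n_events
    s = event_samples(k);
    win = (s - pre):(s + post);               % window around the event
    baseline = mean(eeg(:, (s - pre):s), 2);  % mean of the pre-stimulus period
    epochs(:, :, k) = eeg(:, win) - baseline; % baseline-correct each epoch
end

erp = mean(epochs, 3);   % average across trials = the ERP for this condition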

For linguistic inquiry, we apply this same logic to sentences: when some portion of sentence processing produces a behavioral response (say, a slowdown in reading because of an odd word or structure), we measure the EEG at that part of the sentence.

In EEG work, there are many artifacts that can get in the way, such as moving around, blinking, or greasy hair, so we try our best to remedy most of these. For eye movements and blinks, EOG records the electrophysiological signature of these events, and we later subtract that signal from the rest of the EEG data, reducing overall noise.

Here’s how we went about including this extra data: first, we set up two pairs of bipolar electrodes, one above and below the eye for VEOG (vertical electrooculogram), and one to the sides of the eyes for HEOG (horizontal electrooculogram). The two electrodes in each pair act as reference points for each other, so when the eye moves from center we can use both to calculate where the eye is looking, as well as to pick up the (much noisier, relative to EEG) muscle activity of blinking. Here is a photo of one set of bipolar electrodes next to our amplifier and the electrode connections for 32 of the 64 available ports.

[Photo: bipolar electrodes next to the amplifier and electrode connections]
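(For anyone curious what the bipolar derivation actually looks like in numbers, a tiny sketch follows. The variables eog_above, eog_below, eog_left, and eog_right are placeholder vectors, one sample per time point, not names from our recording software.)

% Illustrative sketch only: deriving the bipolar EOG channels from the
% individual electrode signals. Because each pair is referenced against
% itself, the difference isolates eye activity.
veog = eog_above - eog_below;   % vertical EOG: blinks appear as large deflections
heog = eog_left  - eog_right;   % horizontal EOG: left/right saccades flip its sign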


Once this data is recorded, we can use it to subtract blink and eye-movement artifacts that show up in the other electrodes on the cap.

The actual subtraction method we use comes from a 1983 paper by Gratton, Coles, and Donchin (Gratton, G., Coles, M. G., & Donchin, E. (1983). A new method for off-line removal of ocular artifact. Electroencephalography and Clinical Neurophysiology, 55, 468–484. doi: 10.1016/0013-4694(83)90135-9).

Basically, what it does is take the EOG data and remove it from the other electrodes on the cap, weighted by a regression value based on how far each electrode is from the eyes. So for electrodes close to the eyes, like FPz, where blinks and eye movements are much more visible, the subtraction is larger, while electrodes at the back of the head, which rarely show blinks at all, get a much smaller correction.
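(Here is a rough sketch of the regression idea, in the spirit of Gratton, Coles, and Donchin. It is not the plugin we actually run, and it skips steps the full method includes, such as estimating separate factors for blinks and saccades and removing event-related activity before computing them. The variables eeg, veog, and heog carry over from the sketches above and are placeholders.)

% Rough sketch of regression-based ocular correction: estimate how strongly
% each EOG channel shows up in each EEG electrode (large for frontal sites
% like FPz, small at the back of the head), then subtract that portion.
eog = [veog; heog];                       % 2-by-samples matrix of EOG channels
n_chan = size(eeg, 1);
eeg_corrected = zeros(size(eeg));

for ch = 1:n_chan
    b = eog' \ eeg(ch, :)';               % least-squares propagation factors
    eeg_corrected(ch, :) = eeg(ch, :) - (b' * eog);   % remove the ocular portion
end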

Okay, so we have the methodology, we have the data, now what? How do we use this in EEGLAB or ERPLAB? A few options: one, does this software natively have it? No. Should we write it ourselves? We could, but let’s not reinvent the wheel; let’s see if someone else has contributed. And, indeed, they have. If you would like to use this correction in your EEGLAB, check out this link here. Here is some data I collected on myself today that’s been filtered and processed–


[Figure: before applying correction]


[Figure: after applying correction]


Note that the highlighted region shows a good example of this correction smoothing out a negative deflection in the frontal poles and other electrodes.

Granted, this isn’t the only way to use EOG data and account for eye artifacts; other labs use whatever methodology works for them. Some use independent component analysis (ICA), others use more complex forms of regression and correction, and some use methods that don’t require EOG data at all. But at the end of the day, I’m glad we’ve included this kind of information in our workflow, because now our data is cleaner and our lab is happier.


