Videoconference fatigue from a neurophysiological perspective (first neurophysiological evidence)
Link information
- Title: Videoconference fatigue from a neurophysiological perspective: experimental evidence based on electroencephalography (EEG) and electrocardiography (ECG) - Scientific Reports
- Published: Oct 26, 2023
- Word count: 10,248 words
For all the hate that Facebook gets with its metaverse attempts, I think that their work on VR videoconferencing seems brilliant and could easily reduce a lot of that fatigue.
They recently made the news with the Lex Fridman interview with Mark Zuckerberg, but that only shows part of the technology. Having a realistic avatar is not that useful on its own; their goal is to have a VR conference with spatial sound, face tracking including realistic eye contact, and hand gestures. This means that you immediately know who's talking and at whom, you can have a quiet side conversation with the person sitting next to you without interrupting everyone else, a significant portion of non-verbal communication is reproduced, etc.
For me personally this would remove many of the things that make videoconferences tiring and annoying.
This is still so expensive that folks at Meta have to make a special request to get a setup. I’m sure it’ll eventually be a big help, but I would have expected it to have happened by now. One of the worst parts of a remote meeting for me is a wall of photos and default avatars: nobody knows how their words are received or if anyone is even paying attention.
IIRC from another interview with Zuckerberg, the photorealistic avatars were not an initial goal; instead they tested the other necessary features like hand gestures, face tracking and spatial audio with non-realistic, simplified avatars first. Those were the ugly Metaverse heads seen in some screenshots, and with those I think it's viable right now.
The realistic heads are probably necessary for adoption though, that's true.
I recently listened to a podcast which profiled startup Neurable. Basically they are making headphones which can pick up data from your brain and let you know when your attention is drifting. Sounds like this technology could help solve some of these issues.
That technology sounds inhumane, highly invasive, and dystopic as well. Like something written by George Orwell.
Their promo video certainly doesn't help dispel that, either. It's like something I would expect to see in Black Mirror.
I’m trying to imagine typing by thinking now.
Dear, my manager,
I’d like to request I need to remember to buy some milk a week off next month. Ugh, I should replace this chair. I have no idea what I’d get, though. My parents are coming to visit. I really need to scour the house or I’ll never hear the end of it. I guess I’ll have to pick them up from the airport, too, mom’s afraid of getting attacked on the train and they can’t figure out Lyft. What a pain, the train is so much simpler.
Sincerely,
Tanglisha maybe they’ll turn down this request and I can avoid this whole mess
That video is so dystopian I'm having a hard time believing it is not satire.
That also reminds me of that device that alerts truck drivers when they're falling asleep while driving... How about just giving them reasonable hours instead?
Most countries I know of have already legally restricted truck drivers' maximum daily driving hours. Here in Ontario, Canada, for example, the maximum driving time is 13 hours per day. So, AFAIK, drivers falling asleep due to long hours is caused more by low pay from stiff competition, with the truck drivers themselves attempting to make up for that by driving longer than they're legally allowed to, so they can finish jobs faster and take the next job sooner. But those drivers are typically doing that on the sly, without the companies they're working for knowing about it.
However, a lot of independent drivers these days have also started getting around the legislation, while not technically breaking the law, by working in shifts with other drivers in the same vehicle. When one hits their legally allowed maximum driving limit, they switch off to their driving partner, then go get some rest in the bunk area. When that driver reaches their limit, the first one returns. And by doing that those trucks can essentially keep driving forever, and many are actually coming close to doing that, which is leading to maintenance issues causing accidents, since they often don't stop long enough to properly maintain their vehicles.
p.s. My BiL is in the trucking industry, working as a cab and trailer salesman, so I hear about this inside industry shit all the time. ;)
It was not my intention to start a well reasoned discussion about truck driving conditions, but that is just the nature of Tildes :P
I just used to see ads for such a device late at night and thought it was pretty funny and also a bit tragic.
Why not both?
I hadn't previously watched the promo video. I just did. Wow...fantastic! This is the future I was promised and want.
Did you forget the /s, or are you serious? Because that tech sounds like dystopian nightmare fuel to me.
I have no interest in having my brain hooked up to any technology but I feel like you're taking it too far. Sure, this kind of thing could be used to do some pretty bad stuff and there will definitely be attempts to do it someday. But it can also have huge upsides. Think about the advantages for a quadriplegic or someone with Locked-In Syndrome. With proper restrictions, this could be incredibly beneficial.
I'm curious how this would be for someone with ADHD. Would their brain always need a break? Would it be beneficial to be made aware when you're losing focus? Would it help identify the methods that actually help you as an individual?
Could this help share what someone's subjective experience of the world is like? When a family member gets frustrated because I forgot to do something, would the ability to demonstrate what goes on inside my head help them to understand why I can't just remember things without external stimuli?
I have ADHD and it's pretty severe. This device would stress me out because I would still have all the symptoms of ADHD and also a machine pestering me.
What I need is treatment, possibly a cure. Something medical. Not a portable slave master.
I mean, sure, that tech likely has plenty of legitimate and highly beneficial uses like you gave examples of. But in the context of this post about videoconference fatigue, and Neurable potentially being used to let you know when your attention is drifting, it sure sounds dystopian to me... especially if companies started mandating its use for employees. So I feel like I'm not really taking it all that far when expressing my apprehension about it.
Really only if companies did that. I can't think of why a little ding telling you your mind is drifting is dystopian otherwise.
Why is your mind drifting some failure to be addressed in the vast majority of instances? That implicit judgment is the sinister part, it really doesn't matter if it's a cultural shift or tyranny of some sort.
Because I want to be able to pay attention to what people are telling me. It's not sinister to want to listen to people who are important to me.
I guess I see those unconscious decouplings mid-conversation as a necessary part of what makes conversation valuable. When people are laser-focused the whole time, that's when things start to get convoluted and bitter, in my experience.
I'm sorry for the implicit judgment in my earlier response, though. I saw your comment as a complaint about others, rather than one of looking for a personal solution. Clearly that's not the case. I still have a hard time not seeing any use of such a tool as inevitably problematic, but I shouldn't have responded how I did.
Thanks, appreciate it.
Regardless of the specifics, having tech like this remotely associated with an employer would inevitably result in discrimination against people with ADHD.
Why does it have to be associated with employers?
Because it's completely naive to assume that employers will abstain from using this technology to monitor their employees unless there's very specific targeted regulation against it -- and honestly even in that case they may still find loopholes depending on the details of legislation. Short of a complete overhaul of the power dynamics between employers and employees, particularly in the largely un-unionized US, technology like this would inevitably be used for employee monitoring by employers.
Sure, it's likely some employers will try to use it for those purposes, but that doesn't mean the conversation has to exclusively turn to that. You can also discuss the benefits.
There are no benefits that would outweigh the harm of one's employer monitoring your very attention span, especially for those of us with ADHD.
And there are no benefits that would outweigh the harm of one's employer monitoring you with a camera, yet you don't bring that up every time a new camera is released.
I would argue there certainly are benefits to the existence of video cameras (or even webcams specifically) that outweigh those harms, but more importantly, those existed before I was born. I grew up in a world that had them. Whatever hypothetical battle there would be to prevent cameras from becoming a household item would have happened before my time.
I'm not obligated to be optimistic about a new technology. I don't have to talk about potential benefits of technology that I believe will inevitably be used in ways I find super harmful. Every time facial recognition comes up, I talk about how it's abused by law enforcement against minorities for whom the technology is less accurate to begin with, because that's a huge issue with that technology. I'm not then obligated to talk about all the cool benefits the technology I'm advocating against could have.
I wonder if it’s related to the work on data entry for folks who don’t have the use of their hands or voices.
If it's forced on somebody or the data is shared anywhere, yeah. But it also sounds like a potentially pretty useful mindfulness tool. As someone who suffers from symptoms often described as "brain fog" as part of a chronic illness, I'd love a tool that notifies me my attention has drifted earlier than I'd notice it myself, because noticing often takes me a nontrivial amount of time.
I've got ADHD.
I'll just do what we should be doing. Step away from the computer and take a break. Establish strong boundaries around what you need vs what the organisation wants and go from there.
Tech can't solve this problem without it becoming "WORK HARDER" and we need to move away from that.
You're still only talking about one use case. I'm not just talking about working for someone.
I have this problem when working on my own projects that I actually want and like to work on. I also have it when I go on a walk in a forest and simply want to be present, i.e. focus on my surroundings (I wouldn't use the tool for that, I'm mentioning it as an example - this is not a "set boundaries for your job" problem).
I understand the distaste towards things that focus on increased productivity, but I don't think that everything in that category is bad or only useful for a boss who's pressuring you into something you don't want to do.
That said, if it was to be used for monitoring employees, I would hope that the EU is sane enough to quickly ban it.
It would be nice to get a little ding when you aren't listening to what your SO or friend or anyone else is saying. Or maybe when you're not listening to your subordinates when they're briefing you on what's going on in their work. Or when you're watching a TV show and missing half of it because you're daydreaming or stressing out even though you wanted to decompress. Mind monitoring would obviously be pretty damn bad, but there are a lot of upsides to it.
That said, I don't want something in my head. I want to be me and to improve myself through my own actions...although there's a decent chance I could be convinced to use something that is actually actively improving QOL.
To each their own, I guess. Obviously if forced upon someone, it's a no go. But the technology (by way of EEG, etc.) already exists today and isn't being forced on anyone. As a tool that I can choose to use, I think it's fantastic. They are iterating on the product and hope to let you control basic functions just by thinking them. E.g., think "skip song" and have Spotify skip the song.
I think the concern is more about how employees will cope if monitoring tech, and in this case very invasive monitoring tech, is widely adopted by employers. Now your boss gets a breakdown of how much you're "distracted" giving them yet another lever to pull for efficiency and productivity. Large employers already have a pretty poor record of selecting heuristics for success on their teams, so this seems like another step on an already depressing ladder.
Real-Estate Leaders everywhere:
"THAT'S IT EVERYONE, ZOOM IS BAD FOR YOU. BACK IN THE OFFICE!"
Everyone else:
"Right, maybe we just go back to a quick phone call and emails?"
This is DAMN fascinating. I want to bury my head in the whole document before I make any judgements. But I reckon this is my old 'data transformation' head needing to be in place here, because this will be another "What technology can we use to stop this happening?" argument, rather than a "How can we change our Ways of Working to prevent this in the future?" one.
I've been remote for years. I always hated when meetings required or strongly encouraged having video on. Virtual meetings don't drain me. Having to be on camera the whole time drains me. I've found I do noticeably better in my attentiveness and retention when I'm able to just listen in to the meeting rather than actually be on video the whole time.
I think if you're contributing heavily, or guiding... that little human element works really well. But if you're just consuming? It's absolutely non-essential as long as you're paying attention.
It's a balance, and it's individual and to be agreed between grownups, rather than the blanket madness that so commonly happens.
I think this is a good start at looking into this, but equating a college lecture with a work meeting isn’t helpful. I did glance at one of the other studies referenced.
That other study was very focused: it looked at idea generation and selection. It found a difference with idea generation, but no significant difference with idea selection. Pictures of idea-based meetings are very popular on company websites. Working directly with ideas is the kind of meeting that folks would like to be the norm, but in practice that is usually not the case.
I think it would make more sense to do a study like this in a real office workplace, comparing the readings from in-person meetings vs video meetings. I’m also curious how much of a difference it makes to shut off video; I’ve heard folks cite fatigue as a reason for doing this, but not whether it actually helps. Multitasking is more obvious in person; it’s common in some office cultures to keep working while also being physically present in a meeting. Some folks even attend more than one meeting at a time over video. OF COURSE that will cause fatigue.
There’s another factor that I’m not sure these folks are aware of: many companies build teams out of folks who are geographically separated. The place I used to work put a lot of thought into meetings that had mixed in-person and video participants, though not a formal study. We always had more remote participation when everyone was on video individually than when half or fewer of the folks in the meeting were on video. When in the minority, the folks on video always ended up feeling like “second-class citizens”. This was at its worst when there was a whiteboard involved.
Apologies that this is somewhat rambly and unfocused. I have worked in environments where folks actively worked to make meetings as productive as possible and environments where showing up to a meeting was a status symbol, so folks would attend as many as possible based on who else was invited and the project(s) involved. A laboratory environment can certainly normalize the content itself, but that assumes all meetings are of the same value.
You mean you don't want to be in meetings where you just give updates to the work that's going on and shit talk to your peers whilst the big boss sits in and just hamfists their way through whatever it is you're talking about? Got it. MORE OF THAT.
I do ask folks not to multitask during meetings; as a dude with ADHD, it's painful to work through a lot of the time. Both in person and online, but you see it so much online.
There's a definite issue with Hybrid just not fucking working, ever. I've never seen it work well.
You're either in person, or you're all remote and contributing. The combination just absolutely sucks.
It's pretty clear, dude, don't worry.
I'm senior leadership and I very rarely impose how I work on others, but it's so damn important to get a few base rules out of the way and "hybrid doesn't work" is a biggie. But also it's not on for everyone to get together constantly. We should have dedicated 'in office' times that are flexible with those needs and then "in-office, out-of-office" kicks in. I don't want folks coming to the office to sit on Zoom calls and code, I want them working to fix and ideate.
But the C-Suites of firms don't get that and can't trust that it works. Morons.
Great prediction, look at all the other top level comments.
It's always the case. I see it time and time again when I go consulting around businesses.
PLEASE MAKE THE TECH WORK BETTER... when it's really a people or ways-of-working problem that people daren't address.