Survival of the Coziest “In-Your-Face” User Interfaces

UX and Human-Centered Design for Tomorrow’s Cyborgs


Jonah Angeles
8 min read · Aug 24, 2024
Photo by Bradley Hook via Pexels

The Dawn of Posthumanism

One way or another, humanity is going to have to face it:

Our technology is becoming so integrated into our lives, it’s only a matter of time until we physically merge with it.

Our technological era heralds the dawn of posthumanism.

And your average posthuman will want to be comfortable.

For the purposes of this article, I’m using the term posthuman to mean “cyborg,” a being that is, in some way, physically integrated with technology. It’s worth noting that the definition of posthumanism is broader and more wide-ranging.

According to N. Katherine Hayles:

Posthumanism is a philosophical movement that envisions potential transformation and transcendence of human nature through technology, science, and other means.

So where do we stand?

Not posthuman yet, but we are well on our way.

Though that depends on who you ask.

Also, whether or not VR headsets like the Meta Quest or Apple Vision Pro offer a cozy user experience will largely depend on who’s wearing them.

Regardless, AR and VR technologies are becoming more widespread and may soon be considered common household items.

As our species has evolved, so have our tools, or, more fittingly, our technology. Our primate ancestors relied on and bonded with tools long before the dawn of our species.

Take the famous match cut in Stanley Kubrick’s 2001: A Space Odyssey, in which an ape’s bone weapon becomes an orbiting spacecraft.

Emerging technologies in the fields of virtual reality (VR) and augmented reality (AR) require us to wear our devices over our faces. The physical boundaries have already blurred.

You could say this bond is getting pretty serious.

Two foundational principles of the UX/UI design field are accessibility and usability, which tie into the related fields of human factors and ergonomics.

Technological innovations are more likely to be adopted if they’re comfortable and intuitive. They shouldn’t feel like chores to use in our daily lives. They have to be seamless — cozy.

This is the basis of Steve Krug’s Don’t Make Me Think, which stresses reducing the user’s cognitive load.

The less thinking the user does, the better.

To parallel Charles Darwin’s concept of survival of the fittest, current technological innovations can be summed up as survival of the coziest.

Whether or not we, as users, are aware of it, we “select” our products based on the “coziest” user experience, so to speak, directing the market accordingly.

From “indestructible” Nokia phones to BlackBerrys to the touchscreen-based devices we use today, our phones have come a long way.

As odd as it sounds, an ideal these products can aspire to is eyewear: glasses and contact lenses are considered well designed if the wearer can go about their day without giving them any thought.

With this in mind, glasses and contact lenses are examples of integrated technologies, and that’s where our tech is trending — at least, with augmented reality (AR).

Photo by Maxim Berg on Unsplash

The Extended Mind Thesis

In a seminal work of cognitive science, The Extended Mind (1998), Andy Clark and David Chalmers posit that people and their tools collectively act as a “coupled system,” externalizing the power of the mind onto tools outside the body.

The efficiency of this coupled system is dependent on the design of the tools themselves.

This is a concern that human-centered design (HCD) aims to address, especially when applied to technology.

According to Don Norman in The Design of Everyday Things, HCD is a design philosophy that prioritizes the user experience by addressing the user’s needs while accounting for physical and psychological factors.

Because this coupled system constitutes a complex system (in systems theory), applying a systems approach to user interface (UI) design is essential.

Systems exist on multiple levels, from the biological system (e.g., the nervous system) of an average user to the operating system (OS) of a digital device.

More so than their predecessors, user interfaces in the 21st century seem to place more emphasis on multitasking, on both a human and OS level; an operating system that can run and execute multiple tasks concurrently allows users to do the same.

For this, our technologies must be mindlessly intuitive — as self-explanatory as possible.

So where is all of this relevant to recent and emerging technologies in the 21st century?

We can start with smartphones serving as extensions of our minds.

Mobile workflow has vastly improved as the cell phone has evolved into a complex, pocket-sized computing system.

In his book Physics of the Future, Michio Kaku states:

Today, your cell phone has more computer power than all of NASA back in 1969, when it placed two astronauts on the moon. Video games, which consume enormous amounts of computer power to simulate 3-D situations, use more computer power than mainframe computers of the previous decade. The Sony PlayStation of today, which costs $300, has the power of a military supercomputer of 1997, which cost millions of dollars.
- Michio Kaku, Physics of the Future

Kaku made this statement in 2011.

How Good Design “Hacks” Our Psychology

As mobile technologies become increasingly integrated into our lives, it’s important to take into account the psychological factors that drive our usage and can subsequently lead to addiction.

Oulasvirta et al. (2012) identified the checking habit commonly associated with smartphone usage, wherein users compulsively check their phones for notifications. According to the researchers, opening one’s phone with the intent of checking one specific item very commonly leads to other activities the user may not have had in mind.

The researchers attribute habit-formation with smartphones to, surprisingly enough, good design:

The user interfaces present users with “quick access to rewards,” whether through the pursuit of endless novelty and stimulation on the world wide web, e-mails and Facebook notifications, or keeping up to date with news stories.

In other words, the design principles that facilitate our consumption of digital media also contribute to psychological addiction.

The researchers further specified that the term “internet addiction” is quite loose and used more colloquially than in professional contexts, and that the behavior is better described as “overuse” stemming from a lack of discipline on the user’s part.

Scott Berkun, in The Role of Flow in Web Design, discusses human-centered design that emphasizes flow, a psychological state of complete immersion in an activity.

It seems designing user interfaces with flow in mind would be one way to address technostress and increase coziness.

But Will Cyborgs Look Aesthetic AF?

In the same way that brands such as Sony, Skullcandy, and Beats by Dre design their headphones with both fashion and function in mind, companies behind wearable technologies will be influenced by fashion and pop culture.

IDK about you, but this ain’t it. // Photo by Azwedo L.LC on Unsplash

The average posthuman will want to look good, if not fashionable.

Neither is this. // Photo by My name is Yanick on Unsplash

Usability aside, the aesthetic design of VR/AR headsets has a long road ahead. Still, from my time with the Meta Quest 3 and Apple Vision Pro specifically, the user experiences aren’t terrible.

I would hesitate to use the term cozy.

Bearable, maybe — but not cozy.

Headphones, like wristwatches, are among the most well-known examples of wearable technologies that people also use for aesthetics and adornment.

The sociological implications of the new waves of wearable technologies are fascinating. For these emerging technologies to reach their potential, public use must not be stigmatized.

At this point, I’m sure we’ve all seen videos of people using VR/AR headsets in public irresponsibly.


Augmented reality (AR) technologies in particular are of interest to many businesses across the globe, more so than VR.

As Apple CEO Tim Cook said in 2017:

I also like the fact that it doesn’t isolate. […] I like our products amplifying thoughts and I think AR can help amplify the human connection. I’ve never been a fan of VR like that because I think it does the opposite. There are clearly some cool niche things for VR but it’s not profound in my view. AR is profound.

AR devices such as Google Glass resemble ordinary glasses but superimpose digital information onto your field of vision.

Weiz et al. (2016) explain that “smart glasses,” wearable devices that affect one’s visual perception, such as Google Glass, can “enhance the real world or immerse the user in fully virtual worlds,” but that the subjective norm (one’s perception of what’s normal or acceptable) hinders mainstream usage.

It goes without saying that someone wearing smart glasses could be recording video unnoticed, and therein lies the issue: privacy.

In a 2015 International Business Times article, writer Luke Villapaz brands the issue the “Glasshole” stigma, stating that the product’s aesthetic design, along with concerns raised over privacy, prompted Google to hire iPod creator Tony Fadell to oversee development.

With this, it seems Google hopes to redeem Google Glass in the eyes of the public, as a safe and desirable product.

Photo by DIEGO SÁNCHEZ on Unsplash

What Do You Think?

On the cutting edge of the field today are neuroheadsets, which use electroencephalography (EEG) to detect users’ brain activity for various purposes, possibly even for deciphering basic mental commands issued by the user (Mayer et al., 2016).

With these technologies, privacy seems to be a general concern as well, according to Mayer et al. (2016). The results of their survey were very similar to those of Weiz et al. on the perceived social acceptability of Google Glass.

In this case, though, privacy was not as large a concern. The researchers suggest that manufacturers of neuroheadsets “emphasize the instrumental benefits” and need not worry about what anybody thinks at the moment. This may partly be due to factors such as marketing and branding. Mindflex, a toy developed by Mattel, for example, claims to allow users to control a ball in a maze using their brainwaves.

In The Future of the Mind, Michio Kaku speculates that technology is headed toward allowing users to control their devices with conscious thought. Beyond novelty devices, such as EEG headbands with cat ears that perk up whenever the wearer’s attention is focused on a particular stimulus or person, EEG and brain-computer interfacing (BCI) technologies have already been implemented in prosthetic limbs.
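To make the idea concrete, here’s a minimal, hypothetical sketch of the kind of signal processing behind that sort of “attention” detection: estimate band power from a raw EEG trace and apply a simple threshold. The sampling rate, frequency bands, threshold, and synthetic signal below are illustrative assumptions on my part, not how any particular headset actually works.

```python
import numpy as np
from scipy.signal import welch

FS = 256  # assumed sampling rate in Hz (illustrative, not any product's spec)

def band_power(signal, fs, low, high):
    """Average power spectral density within [low, high] Hz, via Welch's method."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].mean()

def attention_score(signal, fs=FS):
    """Toy 'attention' metric: ratio of beta (13-30 Hz) to alpha (8-12 Hz) power."""
    beta = band_power(signal, fs, 13.0, 30.0)
    alpha = band_power(signal, fs, 8.0, 12.0)
    return beta / alpha

if __name__ == "__main__":
    # Stand-in for a real recording: noise plus a dominant 10 Hz (alpha) rhythm,
    # which a relaxed brain tends to produce.
    t = np.arange(0, 10, 1 / FS)
    fake_eeg = 0.5 * np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(t.size)
    score = attention_score(fake_eeg)
    print("focused" if score > 1.0 else "relaxed", f"(score={score:.2f})")
```

Real BCIs are far more involved, with many electrodes, artifact rejection, and trained classifiers, but the basic loop is the same: measure a signal, extract features, map them to a command.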

The integration has been a gradual process, and the more users adopt these boundary-blurring technologies, the more these technologies will continue to become a part of the user. Or, in extreme cases, literal extensions of the users’ brains.

That’s the vision for the likes of Neuralink and Paradromics, among others, whose devices are surgically implanted. Human participants have already taken these BCI technologies for a spin.

Whether or not humanity collectively embraces and integrates such technology is up in the air.

Either way, your average posthuman will want to be comfortable.

Photo by Adam Neumann on Unsplash

Thanks for reading!
