Brain Salon

Real World and Synthetic Realities

What will it mean to be human in realities beyond the one we were born into: realities created in full VR settings, and in the various other forms of Spatial Computing?


Feb 25, 2020
Piotr Motykiewicz

To start, we may already be hallucinating even in the real world. As Anil Seth put it in his TED talk: “If hallucination is a kind of uncontrolled perception, then perception right here and right now is also a kind of hallucination, but a controlled hallucination in which the brain’s predictions are being reined in by sensory information from the world. In fact, we’re all hallucinating all the time, including right now; it’s just that when we agree about our hallucinations, we call that reality.”

To try to answer that question, let’s review the current landscape of these technologies. As some put it, we are undergoing a Cambrian explosion in the field of Spatial Computing.
Spatial.io is working on combining Nreal AR hardware with its own software to create a blended environment in which participants across remote spaces can see each other and interact with shared elements in real time. This is an interesting foray into a new type of synthetic reality that combines computer-generated and physical objects.


Going further, research from the University of Sussex showcases a levitating volumetric display that removes the need for the observer to wear any VR/AR hardware at all in order to see the synthetic reality. The display uses ultrasound waves to trap particles in mid-air and illuminates them with colored light, creating animated synthetic objects suspended in 3D.


On the hardware side we see a multitude of approaches as well. Mojo Vision is working on a contact lens with an embedded display that lets the wearer see information seamlessly overlaid on the visual signal entering their eye.



Here’s a person who scanned their apartment into VR.


3D screens are becoming a reality as well, with Looking Glass Factory and PORTL Hologram Company both making recent announcements.


The MAD Gaze GLOW will be available for purchase in April, 2020.


XR industry analyst Mike Boland of AR Insider said it’s still early for consumer head-mounted displays, and that a first mover, even if initially successful, would still be vulnerable to Apple’s 2022 or 2023 launch of its smart glasses. “MAD Gaze is a company to watch,” he said. “They’ve made a lot of headsets. They know how to do that. But remember, there were a lot of other mp3 players before the iPod came out.” [source]

Here is a cafe in Seoul that looks computer generated but is in fact a physical location styled to look like a synthetic reality.

Dain Yoon, an artist, uses creative make-up on her face and body to express herself through surreal art, fooling machine learning algorithms trained for facial recognition in the process.


The lines between the real world and synthetic realities are starting to blur. What will be the resulting short- and long-term consequences for humanity?

Upcoming Brain Salon Event
Electrification of World Transportation
Is Big Oil doomed?
March 2020, Williamsburg and Alpine, NJ.
Attend Event