
Valve Reveals Partnership with OpenBCI to Make VR Gaming More Immersive

Romulus

Member
Valve says dev kits will be out by 2022


Newell says the partnership is working to provide a way so “everybody can have high-resolution [brain signal] read technologies built into headsets, in a bunch of different modalities.”

OpenBCI says Galea gives researchers and developers a way to measure “human emotions and facial expressions” which includes happiness, anxiety, depression, attention span, and interest level—many of the data points that could inform game developers on how to create better, more immersive games.

Provided such a high-tech VR headstrap could non-invasively ‘read’ emotional states, it would represent a big step in a new direction for gaming. And it’s one Valve clearly intends to leverage as it continues to both create (and sell) the most immersive gaming experiences possible.





 

Jaxcellent

Member
This is super awesome actually, so NPCs can react to the way you feel. This could really screw with your mind. Imagine a jump scare with this tech.

With VR I wish I could peek into the future a little, see where it goes. I'm still amazed how good PSVR looks. I don't think we will be in "Ready Player One" territory anytime soon, but it will be a hell of a journey.
 

Romulus

Member
This is super awesome actually, so NPCs can react to the way you feel. This could really screw with your mind. Imagine a jump scare with this tech.

With VR I wish I could peek into the future a little, see where it goes. I'm still amazed how good PSVR looks. I don't think we will be in "Ready Player One" territory anytime soon, but it will be a hell of a journey.

I think, at the absolute very least, this shows that one of the richest gaming companies in the world is still very invested in VR technology. But it's not like the tech is going anywhere either, given the Oculus Quest's success.

I can almost imagine how this excites people who work at Valve. The possibilities with stuff like Half-Life or even new series...
 

Hudo

Member
This is super awesome actually, so NPCs can react to the way you feel. This could really screw with your mind. Imagine a jump scare with this tech.

With VR I wish I could peek into the future a little, see where it goes. I'm still amazed how good PSVR looks. I don't think we will be in "Ready Player One" territory anytime soon, but it will be a hell of a journey.
This would require game devs actually giving a shit about AI beyond simple Markov Chains instead of just pushing graphics, so I'm not sure whether this will actually come true (it sounds like a really cool idea, though!).
 

Moochi

Member
Integrating a system like GPT-3 into dialog and quest chains would go a long way. Doing that without breaking the game would be next to impossible with current methods. I'd imagine you'd have to build a neural network that had a discriminator to identify valid game states, dialog, quest completion flags and variables, etc.

In effect you would still be bound to Markov Chains, but the complexity of the system would be so high there would be the illusion of freedom.

Imagine the compute resources needed to do physically accurate ray tracing and particle simulations. Then take that and multiply it by several extra dimensions. The idea of using real AI on every NPC is probably unattainable for a few decades. A more limited approach will come first. Imagine a single companion character who would respond and act in ways that could almost pass the Turing test kept "on script" by a discriminator or game master, so that it's like playing with someone who never breaks character or breaks the fourth wall.
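A minimal sketch of that generate-then-filter loop, in Python. Everything here is hypothetical (the canned stand-in for the language model, the quest flags, the rejection rules); it's only meant to show how a discriminator could keep generated dialog inside valid game states:

import random

# Hypothetical game state the discriminator checks candidate lines against.
GAME_STATE = {
    "quest_flags": {"found_key": True, "met_blacksmith": False},
}

# Topics that would break character or the fourth wall.
BANNED_TOPICS = ("video game", "graphics card", "real world")

def generate_candidates(prompt, n=4):
    # Stand-in for a GPT-3-style model: returns canned candidate lines.
    pool = [
        "You found the key? Then the cellar door is no longer a problem.",
        "Ask the blacksmith about the cellar door.",
        "This is just a video game, you know.",
        "The guards change shifts at midnight. That's your window.",
    ]
    return random.sample(pool, k=min(n, len(pool)))

def discriminator(line, state):
    # Reject lines that break character or contradict the current game state.
    text = line.lower()
    if any(topic in text for topic in BANNED_TOPICS):
        return False
    if "blacksmith" in text and not state["quest_flags"]["met_blacksmith"]:
        return False  # the player hasn't met the blacksmith yet
    return True

def npc_reply(prompt, state):
    for line in generate_candidates(prompt):
        if discriminator(line, state):
            return line
    return "..."  # safe fallback if every candidate is rejected

print(npc_reply("player: I found a key in the ruins", GAME_STATE))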
 
Last edited:

pr0cs

Member
It's gonna need big improvements in latency before it adds much value to gaming. I'm not against it, but I expect the first iterations will be pretty meh.
 

IntentionalPun

Ask me about my wife's perfect butthole
This would probably start out as being used by game companies themselves to influence the design of a game, not something sold to consumers.

There are companies that use devices like this to tell you how your website or marketing emails/social media posts are "emotionally engaging" users. Imagine a car company using this to measure how much their commercial makes a man feel fucking manly.

Now imagine a game company using it to measure how fun/exciting/scary a game is.. or how emotionally engaging a cut scene is.. and then they can do A/B testing. "This version of the level measured as less exciting than this other version."

That's why Gabe says "in their test labs" and says "the data is too important."

It's about collecting the data, not controlling the game itself.

My guess at least; getting this in consumer hands for actually influencing games in real time is far off IMO.
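A toy example of that kind of A/B comparison in Python, with made-up per-tester "excitement" scores standing in for whatever the headset would actually report:

from statistics import mean, stdev

# Hypothetical excitement scores (0-1) logged per playtester for two builds of a level.
level_a = [0.62, 0.71, 0.55, 0.68, 0.74, 0.60]
level_b = [0.48, 0.52, 0.57, 0.45, 0.50, 0.61]

def summarize(name, scores):
    print(f"{name}: mean={mean(scores):.2f}, stdev={stdev(scores):.2f}, n={len(scores)}")

summarize("Level build A", level_a)
summarize("Level build B", level_b)

# Crude effect size: difference in means relative to the average spread.
pooled = (stdev(level_a) + stdev(level_b)) / 2
print(f"Build A measured {(mean(level_a) - mean(level_b)) / pooled:.1f} SDs more 'exciting' than build B.")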

Lord Gaben said:
“If you’re a software developer in 2022 who doesn’t have one of these in your test lab, you’re making a silly mistake,” Newell tells 1 News. “Software developers for interactive experiences — you’ll be absolutely using one of these modified VR head straps to be doing that routinely — simply because there’s too much useful data.”

This wouldn't be for "VR Games" only either.

the article said:
OpenBCI says Galea gives researchers and developers a way to measure “human emotions and facial expressions” which includes happiness, anxiety, depression, attention span, and interest level—many of the data points that could inform game developers on how to create better, more immersive games.

They are just misconstruing this as something that will come to VR games soon.. why would it even be limited to VR in the first place, ever, if it's used to control games? lol
 
Last edited:

Romulus

Member
This would probably start out as being used by game companies themselves to influence the design of a game, not something sold to consumers.

There are companies that use devices like this to tell you how your website or marketing emails/social media posts are "emotionally engaging" users. Imagine a car company using this to measure how much their commercial makes a man feel fucking manly.

Now imagine a game company using it to measure how fun/exciting/scary a game is.. or how emotionally engaging a cut scene is.. and then they can do A/B testing. "This version of the level measured as less exciting than this other version."

That's why Gabe says "in their test labs" and says "the data is too important."

It's about collecting the data, not controlling the game itself.

My guess at least; getting this in consumer hands for actually influencing games in real time is far off IMO.



This wouldn't be for "VR Games" only either.



They are just misconstruing this as something that will come to VR games soon.. why would it even be limited to VR in the first place, ever, if it's used to control games? lol



Newell himself mentions headsets; it's not just the article.
Newell says the partnership is working to provide a way so “everybody can have high-resolution [brain signal] read technologies built into headsets, in a bunch of different modalities.”

The way a headset can track your eyes likely gives a better connection to the brain too. And immersive technology just doesn't make sense without VR or AR, IMO; flat gaming just sucks at it.
 

IntentionalPun

Ask me about my wife's perfect butthole
Newell himself mentions headsets; it's not just the article.


The way a headset can track your eyes likely gives a better connection to the brain too. And immersive technology just doesn't make sense without VR or AR, IMO; flat gaming just sucks at it.
Sure.. in the far future. Like my post said.. the initial rollout of this tech would be for analysis of any game, not just VR games.. the article purposefully doesn't make this very clear.

The dev kits mentioned, for instance, are likely not for actually controlling games.
 

Romulus

Member
Sure.. in the far future. Like my post said.. the initial rollout of this tech would be for analysis of any game, not just VR games.. the article purposefully doesn't make this very clear.

The dev kits mentioned, for instance, are likely not for actually controlling games.

I wouldn't expect full games with brand new tech either.
 

Retinoid

Member
So VR finally made Valve wake up and realize they are supposed to make video games. I'll take it, for what it's worth. Alyx was a really strong showing for the potential of VR games and even though I wasn't the biggest fan of it, at least Valve are making shit again. They've been sitting on Steam money for far too long.
 
Last edited:

Jaxcellent

Member
We are a long way off from real AI NPCs, sure, and also pretty far off from real-time brain scans. But IMO the facial expression detection part is possible. It could be used more like an input device:

For example: you're in a scripted conversation with an NPC. The input could be options you select on screen, or even better, you read the option on screen, the game detects which sentence you read, and that option is chosen.. anyway.

The NPC (example: "mode 1, normal") makes a joke. The hardware/software combo picks up whether you're laughing/smiling or not (maybe you think he is making a stupid joke, so you're not smiling). This can be seen as an input for the next chain.. there could be 2 possible reactions (roughly like the sketch below): the NPC ("mode 2") is friendly to you because you smiled at his joke,
or the NPC ("mode 3") dislikes you because you didn't smile at its joke.

It will start with small things, guys, but we will get there...
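Something like this, as a tiny hypothetical Python sketch, where the smile signal is just a stand-in for whatever the headset's expression tracking would report:

from dataclasses import dataclass

@dataclass
class ExpressionFrame:
    # Hypothetical per-frame reading from the headset's expression tracking.
    smiling: bool
    confidence: float  # 0.0 - 1.0

def next_npc_mode(frames, threshold=0.6):
    # Mode 2 = friendly (you smiled at the joke), mode 3 = dislikes you (you didn't).
    smiled = any(f.smiling and f.confidence >= threshold for f in frames)
    return 2 if smiled else 3

NPC_LINES = {
    2: "Ha! Knew you'd like that one. Come on, I'll show you the shortcut.",
    3: "Tough crowd. Fine, find the shortcut yourself.",
}

# Frames captured during the second or two after the NPC delivers the joke.
reaction = [ExpressionFrame(False, 0.1), ExpressionFrame(True, 0.8), ExpressionFrame(True, 0.7)]
print(NPC_LINES[next_npc_mode(reaction)])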
 
I think by 2030 headset VR will be a thing of the past and you'll just be in a lucid photorealistic dream while lying down.

Either that, or AI or extremists will have us killed.
 

Kenpachii

Member
How? You can play it on like 7 different headsets. No restrictions to Valve hardware.

The only reason they made it was to sell their shitty headset. If they didn't want to sell a headset, that game would never have been made.

The successor to Half-Life will probably require one of those things while they're at it. Why? Because they'll probably sell one of them themselves.

Or, you know, they could just make a game and not tie it to plastic nobody cares about.
 
Last edited:

Romulus

Member
The only reason they made it was to sell their shitty headset. If they didn't want to sell a headset, that game would never have been made.

The successor to Half-Life will probably require one of those things while they're at it. Why? Because they'll probably sell one of them themselves.

Or, you know, they could just make a game and not tie it to plastic nobody cares about.

Then why do they continue to make their "plastic shit" if no one cares? The same shit they couldn't keep in stock for months. But sure, no one cares.

Also, you don't need their $1000 headset. It plays well on $300 headsets. So, what are you saying exactly?
 
Last edited:

Kataploom

Gold Member
FFS, can't they just stop that shit?

I'm really uncomfortable with how they're trying to get the most out of us... If only it were used solely for advertising, but it's clear politicians will use that data for their mass psychology tactics (we see that every day with big data, and we saw it with the Cambridge Analytica fiasco).

Fuck that "reading people's emotions in a non-invasive way"... Whatever that means, as if the sole fact that they are constantly "reading my emotions" weren't invasive enough.

I'll definitely skip VR, just like I skip games that require kernel-level anti-cheat systems.
 
FFS, can't they just stop that shit?

I'm really uncomfortable with how they're trying to get the most out of us... If only it were used solely for advertising, but it's clear politicians will use that data for their mass psychology tactics (we see that every day with big data, and we saw it with the Cambridge Analytica fiasco).

Fuck that "reading people's emotions in a non-invasive way"... Whatever that means, as if the sole fact that they are constantly "reading my emotions" weren't invasive enough.

I'll definitely skip VR, just like I skip games that require kernel-level anti-cheat systems.

I, for one, am glad that the excuse is no longer that VR is a gimmick with no games.

Now it's that future VR is the Matrix, so they can't play now to fund that.

That's sure progress, and a sign that VR is going mainstream until it enslaves mankind.
 

Valonquar

Member
I've watched enough Ghost in the Shell to know it's a real bad idea to be an early adopter of brain-to-machine interfaces.
 

Kataploom

Gold Member
I, for one, am glad that the excuse is no longer that VR is a gimmick with no games.

Now it's that future VR is the Matrix, so they can't play now to fund that.

That's sure progress, and a sign that VR is going mainstream until it enslaves mankind.
Well, this was expected lmao
 
Here, lemme try: "Boo hoo hoo, company I don't own, make games already! Do what lots of companies like Ubisoft, EA, and Activision do.. plunk out tons of uninnovative crap!"

Or... Valve, please keep being the Willy Wonka of game companies: do what you want, make what you want, have successes and failures that push games and technology forward.. and every once in a while make a game. Valve is more than a game studio, and has been for years; get over it.

This tech is awesome... is it input only, though? Could you simulate touch and smell through it?
 

DrNeroCF

Member
I'm jealous of everyone who isn't too cynical about the industry to think this will only be used to polish any remaining quirks and personality out of a game during QA, and to learn the exact right moment to show you an ad or ask you to spend money.
 
It'll be used to sell ads to us better. Data collection! Ewgh, it's gonna just be used to tell Facebook what ads make me feel a certain way, and when I'm most pliable to react to them.

Sure, it might help test games too, but ad money is what it's for. And the implications for politics are concerning. I mean, you all know by now that the big left/right division lately is from ad manipulation, right?

And I love VR.
 