seems the “Vision Pro” DOES have a virtual keyboard… So I’ll bet direct control of an external Mac, with the VP doing the display(s), is a very real possibility.
Typing without haptic feedback - sounds like we can then give Xojo another chance
Whatever happened to SixthSense, the MIT Media Lab project that was demo’d at TED in 2009?
It didn’t seem to need goofy glasses.
From what I understand, you can use it that way, or with the Mac’s keyboard and mouse, but it appears you are limited to a single 4K screen.
Yeah, that’s pretty much what I would like also, same problem. I use a Mac…
I read several accounts of the Vision Pro announcement and one of the statements was that “thousands of existing apps will work with Vision Pro as-is”. Of course at this point we don’t know what apps they are talking about, iOS or macOS, etc., or how leaky the abstraction is, but one thing I am pretty sure of is that certain ahem vendors who don’t keep up with OS best practices and evolution will have more difficulty than others.
From what I understand it’s iPad apps.
AFAICT Vision Pro REQUIRES Swift and SwiftUI
So that probably rules out our green friends
I could easily be wrong but I guess we’ll find out
Robin Roberts: “Do you think this is something that the average person will be able to afford?”
Tim Cook: “I don’t know … but I think it’s a great value.”
Someone was asking about being able to project macOS into AppleVision. The answer is yes:
Of course you can also use a keyboard and trackpad, connected via Bluetooth, and you can also project a Mac into the Vision Pro … I didn’t get the chance to try the Mac projection, but truthfully, while I went into this keynote the most excited about this capability, the native interface worked so well that I suspect I am going to prefer using native apps, even if those apps are also available for the Mac.
I wonder what moving your head around does? Like does it pan?
I haven’t read about it yet though.
There’s a ridiculous number of cameras and sensors (including lidar) pointing at you and the environment, coordinated by a new Apple-designed realtime chip called the R1 (no published specs as yet) coupled with an M2. What they figured out from research is that response time has to be under 8 ms or people get vertigo from it to varying degrees.
I read a post by a guy who worked on the foundations of this project, no longer with Apple but free to speak in general terms about what he did, post-announcement. The most interesting thing he discovered, and is very proud of, is that there’s a completely reliable “tell” detectable in the wearer’s iris that predicts what you are going to do next. IOW the device knows exactly which control you are looking at, and that you have decided to manipulate it, basically before you are even consciously aware of the decision, because the iris gives some subtle cue that it is anticipating a response from your decided (but as yet unexecuted) action. Thus the system can get started on all the requisite calculations.
It’s an impressive piece of R&D, especially if it works in the real world.
The article I linked to says you put it on and at first you just see what you would if you didn’t have it on, it’s that good at giving you a panoramic view of the real world with only minor restriction around the edges of your vision. The part not in the demo was that it will also represent the covered part of your face to those looking at you, so you’re not completely isolated in either direction. That could be creepy / weird, depending on exactly how it works.
Someone asked Tim Cook (AppleVision being his legacy project / mark on the company) if ordinary people would be able to afford it. He said, “I don’t know, but I think it’s a great value”. Very Apple/elite-ish sort of comment, but you know over time it will get better and cheaper, and while I cannot really imagine being fine with poking in thin air with zero tactile feedback, who knows, maybe it works better than it sounds.
Also specifically re: your question, projected macOS desktop(s) will just be positioned wherever you want them in a sort of cylindrical panorama around where you’re sitting. In one of the demo shots in the article the macOS app is to the user’s left, so you just look over at it as if it were a physical monitor left of center. If you turn your head then you can look directly at it.
my concern (and probably not a big deal, as I could never justify buying one) is that I have no depth perception and therefore cannot watch 3D movies (am ok in the real world). So I doubt I’d get the full (any) perception inside the goggles.
Woah, that’s wicked cool.
In addition to the $3500 price tag, if you use glasses you have to buy corrective lenses to insert into the device. Zeiss helpfully will make them for you – for a price.
This is definitely a device for people with more $ than sense. It reminds me a little bit of the “bag” phones you would use in your car in the early 1990s. For my particular situation at the time, $1000 a month in usage fees was worth it as I had to physically travel to a lot of clients in the region, and I could turn some of my drive time into billable hours. One time I stopped at a fast food place and the clerk at the window was like, “hey guys, look, this guy has a CAR PHONE” so you get that “wow” factor attention with everyone craning their neck out of the window to glimpse the new wonder tech – but for most people of that era, it was too much $ and screwing around compared to other options.
This will be like that I think … 10, 15 years from now, though, there might well be more compelling use cases.
yeah… and I DO wear glasses too …
I watched the Vision Pro segment of the keynote last night with my wife. And multiple times we both said, “We’d do that,” especially in the RV with limited space. But the one thing that’s missing, or at least not mentioned, is how we’d interact as a couple with Vision Pro.
I miss a big screen TV. I miss a big monitor. Vision Pro could potentially fix that. But what do we do as a couple when watching TV/movies? I doubt we’d spend that type of money to get two headsets. And frankly if we wear them all day long for work I doubt we’ll want to do that in our free time.
I don’t play any video games any more simply because I spend all day in front of a screen. Why would I want to spend even more time in front of one? I could see that being even more true with the headset.
Yes that is probably v2 or v3 I suppose. Also if you wanted to go (relatively!) on the cheap and share a device and one or both of you wear glasses, how much of a PITA is it to switch out the add-on lenses (my guess: too much).