With the recent appearance of ARCore (Google) and ARKit (Apple), Augmented Reality (AR) is suddenly being talked about all over the web. Ori Inbar is one of the smartest voices out there. As the original founder of the New York Augmented Reality Meetup, Ori was a huge help to me and to many others getting up to speed in this new world, and now he’s weighed in on the current situation in a must-read September 10th article. He points out that the excitement so far is mostly among developers. As a gee-whiz experience for ordinary early adopters, these new apps are fun, but they don’t go far enough.
Some of the currently-available AR apps allow for precise placement of an image, while you’re standing there holding the phone in place–a teacup on the table, or even a hole in the table opening into a tunnel with a light at the end. The minute the device goes away (or is turned off), the image is gone. It isn’t available to be viewed by the next person who walks up. Such an image is interesting, maybe even amazing, so long as the maker is there. But it can’t stay around, once the person who made it has left. It can’t tell you anything about itself. It has precision, but not persistence.
Other apps may have persistence in the real world, but their location is general, and subject to the vagaries of GPS. GPS is great at helping us navigate our bodies and vehicles through space, where a general direction and location are good enough. But if we want to know exactly where, GPS isn’t very reliable. And for most potential uses of AR, “generally around here somewhere” isn’t good enough. To advertise my taco truck, an AR arrow noting “FOOD” needs to be right over my truck–not half a block away, closer to my competition! And a heritage photo of my grandmother on the front porch of the family home–I want to be able to see Grandma on the front porch, not up on the roof, or on the neighbor’s lawn. Persistence without precision severely limits what can be communicated by AR placement.
To reach its full potential, Inbar and others have noted, AR must be placeable precisely in the real world, and it must persist in that place. Neither ARCore nor ARKit can currently combine precision and persistence. GPS is not specific enough in most situations, and computer vision is a long way from being able to match the real world over time.
So the question seems to be, how can AR become the fully-developed technology enthusiasts foresee? And who will create that breakthrough?
Inbar asserts that the solution is “the AR Cloud”–a shared point-cloud map of the real world. Such a map would allow AR to be placed precisely, in relation to points that map the real world, and because that map is in the cloud, anything tied to it would persist. But developing a point cloud of the entire world poses huge, if not insurmountable, technical problems, not to mention privacy and security concerns that could be equally problematic. (Inbar acknowledges these issues, and posits that the AR Cloud he envisions could be 3 or more years off.)
Our answer to the questions of who and how: Membit™, and its patented Human Positioning System (HPS). Membit bypasses the need to map the entire world by obviating the requirement that the computer do the work of finely matching AR to the real world. Instead, a human does the matching. No point cloud will ever be able to keep up with changes in the real world, be those changes routine, occasional, or catastrophic. But the human eye and brain are well-equipped to accommodate without confusion a tree growing taller (in Brooklyn or elsewhere), a building coming down, a picture being hung on a wall, or a food cart parked in a slightly different spot on the block.
Membit (available now for iOS) allows creation and viewing of precise AR placements anywhere, anytime, by anyone with the app. Membits are designated private, friends-only, or public, and annotated with the date and username of creation, as well as optional additional information, such as historical details, technical specifications, or personal sentiments.
At AWE 2017 (http://www.membit.co/2017/09/
All of these depend on the precision of HPS and on being held in the cloud, so that the image persists and can be viewed anytime by following the map in the app and visually lining up the image.
At Membit we are creating the ultimate tool for humans to tell other humans their story about a place.