How Google’s Project Glass glasses could foster the jobsite of the future

After more than a year and a half of standing by as Android tablets collected dust on store shelves, Google finally took action June 27 with the announcement of its own tablet, the Nexus 7, running the latest version of Android, 4.1 “Jelly Bean.”

That, along with the fact that the diminutive tablet is a quad-core powerhouse that retails for only $199, is big news. But somehow Google found a way to upstage the Nexus 7 with a demonstration of another highly anticipated and futuristic product: Project Glass.

For those unfamiliar with Project Glass, it’s Google’s effort to create the first pair of Internet-connected smart glasses. Glass is the evolution of augmented reality apps, which overlay a user interface and other information onto a real-time image. Current smartphone apps do that through the phone’s camera lens, but rather than making you look at a phone screen, Glass projects its interface directly into your field of vision. Google has a concept video of a first-person experience that shows text messages popping up over a bowl of cereal as the user eats, and, later, walking navigation directions updating as he turns a corner.
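To make that overlay idea a bit more concrete, here’s a minimal sketch of the general technique, not anything Google has published about how Glass actually renders its display: grab live camera frames and draw an interface element, in this case a pretend text-message notification, on top of each frame before showing it. The OpenCV library and the window-based demo are stand-ins for illustration only.

```python
# A rough sketch of the augmented-reality concept: draw a UI element over
# live camera frames. Purely illustrative; not Google's implementation.
import cv2


def run_overlay_demo():
    cap = cv2.VideoCapture(0)          # open the default camera
    try:
        while True:
            ok, frame = cap.read()     # grab the current real-time image
            if not ok:
                break
            # Overlay a "notification" in the upper-left corner of the view,
            # roughly like a message popping up in the wearer's field of vision.
            cv2.putText(frame, "New message: 'Running 10 min late'",
                        (20, 40), cv2.FONT_HERSHEY_SIMPLEX,
                        0.7, (255, 255, 255), 2)
            cv2.imshow("Augmented view", frame)
            if cv2.waitKey(1) & 0xFF == ord("q"):   # press 'q' to quit
                break
    finally:
        cap.release()
        cv2.destroyAllWindows()


if __name__ == "__main__":
    run_overlay_demo()
```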

Glass also features voice assistance similar to Apple’s Siri on the iPhone 4S. A guy in the video sees a poster for a concert and simply asks the glasses to remind him to buy tickets. The video also depicts him spotting some wall art and taking a photo: a frame appears in his vision and the glasses snap the shot.

But these early demo videos were nothing compared to what Google showed off on June 27. Just after unveiling the Nexus 7 at its I/O developers conference, Google began broadcasting live video to those in attendance of a team of skydivers readying for a jump. Two of them were wearing Project Glass prototypes, and when they jumped from the plane, the glasses broadcast the descent live. You can watch the highlights of the demo below.

[youtube hxmbbtuRszA nolink]

It went off without a hitch. Obviously, this had everyone excited for a couple of reasons. First, it proved that Google is serious about this, so serious that it upstaged its first official (and badly needed) tablet. Second, we now know that the Glass prototypes are actually capable of doing some of the cool things Google has been touting.

After seeing this demonstration, I began considering various applications for Project Glass when it becomes available to the general public in about a year’s time. Then it occurred to me how helpful these glasses would be on a construction job site.

First, the very basis for the glasses, augmented reality, would allow guys on the job site to view blueprints by simply asking aloud for them. In a few seconds, the plans are directly in sight. Better still, both 3D models and blueprints could be projected into your vision, letting you see outlines and the finished product on top of the site’s current state.

Next, the glasses are equipped with GPS. In the first demonstrations we’ve seen of Glass, this has been used to provide turn-by-turn directions and the ability to share your location with friends. But theoretically it could be used for reference while grading or paving. Rather than checking a monitor inside the cab, the operator would be able to direct the equipment along lines projected right into his field of vision.
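As a rough, purely hypothetical illustration of what that kind of guidance might rest on, the sketch below computes how far a machine sits off a planned grading or paving line using GPS positions. It assumes the coordinates have already been converted to a local east/north grid in meters, and the function name and example numbers are made up for the sake of the example; this is standard cross-track math, not anything Google has demonstrated.

```python
# Hypothetical guidance math: how far off a planned pass is the machine?
def cross_track_offset(pos, line_start, line_end):
    """Signed perpendicular distance (meters) from pos to the planned line.
    Positive = machine is left of the line's direction of travel."""
    (px, py), (ax, ay), (bx, by) = pos, line_start, line_end
    dx, dy = bx - ax, by - ay                      # direction of the planned pass
    length = (dx * dx + dy * dy) ** 0.5
    # z-component of the 2D cross product (line direction x vector-to-machine),
    # divided by the line length, gives the perpendicular offset.
    return (dx * (py - ay) - dy * (px - ax)) / length


# Example: a paving pass running due north, machine 0.4 m east of the line.
offset = cross_track_offset(pos=(100.4, 50.0),
                            line_start=(100.0, 0.0),
                            line_end=(100.0, 200.0))
print(f"Off-line by {abs(offset):.2f} m "
      f"({'left' if offset > 0 else 'right'} of the pass)")
```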

The feature Google has demoed the most so far is Project Glass’ built-in camera. Having a camera that is literally in line with what you are viewing would enable those on the job site to send still images as well as live, streaming video to colleagues both on and off the site. If you need someone to take a look at or discuss a piece of the project, Glass would allow you to instantly show that person what you’re looking at. Want to compare that image to the plans? It’s all right there. Pull them up and compare them to the image you just shot.

Finally, at the very least, Project Glass would allow people on the worksite to instantly communicate with one another, hands-free. Because Glass supports a Siri-like voice assistant, the glasses take voice commands and can transcribe an email or text message or place a phone call.

What could be really exciting is if a developer combines all of this functionality into a job site management application for the glasses. It’s not crazy to imagine an app that gives those in the field a complete understanding of what’s going on at their job site at all times: hands-free access to a map of the site, the location of their colleagues, diagrams of the site over time to track progress, quickly accessible plans and the ability to instantly communicate with anyone involved with the project, no matter where they are.

Google has stated that Project Glass glasses will be very light and able to fit over other eyewear. That means guys on the job site can wear Glass over safety eyewear.

Google is planning to release a very early version of Project Glass glasses to software developers in January 2013 at a cost of $1,500 per unit. That is a whopping price, but Google executives have said the final version of Project Glass will cost “significantly less” than the “Explorer Edition” being released to those developers early next year.

Google really needs to get the price into the $300 range for Glass to take off in both the consumer and commercial markets. In the meantime, it’s fun to dream about the possibilities Project Glass holds.