Thursday, June 20, 2013
WARNING: This post deals with the technical implications of programming for Google Glass. If you are not familiar with APIs, cloud technology, etc. you may want to leave. You've been warned :)
Google Glass could be a useful tool for firefighters. It can provide information in the form of text and pictures directly to the user's field of vision. This is a rapid means of getting small amounts of information such as dispatch details, location information, or hazard alerts. I don't think Google Glass would be useful for interior firefighting (at least not in its current form). However, for an incident commander, safety officer, or other exterior role it could prove useful.
I have yet to receive a pair of Google Glass, but hopefully I will soon with help from some contributions (hint, hint: you can help by following this link). I do, however, understand how to program for it and can't wait to do so once I get a pair.
Currently there is only one 'official' way to make apps for Google Glass: the Mirror API. Essentially, Google hosts each Google Glass user's 'timeline' on their servers. Cards can be inserted into the timeline using the Mirror API. These cards then appear on the user's Glass and can be viewed, dismissed, etc. The Mirror API is a RESTful API, a standard with which modern web developers are very familiar.
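To make that concrete, here is a minimal sketch of what inserting a timeline card looks like. The payload fields (`text`, `notification`) follow the Mirror API's timeline item format as I understand it from the docs; the dispatch text is made up, and the actual HTTP call (which needs an OAuth 2.0 bearer token) is only shown as a comment.

```python
import json

def make_dispatch_card(text):
    # A bare-bones Mirror API timeline item: "text" is what appears on the
    # card, and a "DEFAULT"-level notification makes Glass chime so the
    # wearer notices the new card.
    return {
        "text": text,
        "notification": {"level": "DEFAULT"},
    }

card = make_dispatch_card("Structure fire: 123 Main St")
body = json.dumps(card)

# Sending the card would look roughly like this (auth omitted):
#
#   POST https://www.googleapis.com/mirror/v1/timeline
#   Content-Type: application/json
#
#   <body>
print(body)
```

Because the API is plain REST over JSON, any server stack that can make an authenticated HTTP POST can push cards to a user's timeline.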
My plan for using Glass in the fire service is to use Google App Engine to host a fire department's data on a cloud server. The data will be hosted in the Datastore. When an incident is sent from the dispatch center, the incident location and nature will be sent to the user's Google Glass. Location information for the address will also be sent to Google Glass and be immediately available to view.
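The dispatch-to-Glass step above could be sketched as a small formatting function: take an incident record and turn it into a Mirror API timeline item. The incident fields (`nature`, `address`) are my assumptions about what a dispatch feed might provide, not a real CAD format, and the `html` and `speakableText` card fields are used here as described in the Mirror API's timeline item documentation.

```python
def incident_to_timeline_item(incident):
    # Render the incident as a styled HTML card; Glass timeline items
    # accept an "html" field for richer layout than plain "text".
    html = (
        '<article><section>'
        '<p class="yellow">%s</p><p>%s</p>'
        '</section></article>' % (incident["nature"], incident["address"])
    )
    return {
        "html": html,
        # "speakableText" lets Glass read the alert aloud on request.
        "speakableText": "%s at %s" % (incident["nature"], incident["address"]),
        "notification": {"level": "DEFAULT"},
    }

item = incident_to_timeline_item(
    {"nature": "Structure Fire", "address": "123 Main St"})
print(item["speakableText"])
```

On App Engine, this function would run in the request handler that receives the dispatch message, with the resulting item POSTed to the Mirror API for each subscribed responder.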
Location information could also be accessed by selecting a 'show nearby information' option. In this case I will be using App Engine's Search API to get the location information closest to the user and then send that data to the Mirror API.
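In production this lookup would be a geo query against the Search API's indexed documents; as a stand-in, here is the underlying logic in plain Python: rank stored pre-plan records by great-circle distance from the user and return the closest few. The records and coordinates are made up for illustration.

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    # Great-circle distance between two lat/lon points, in meters.
    r = 6371000.0  # mean earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def closest_locations(user_lat, user_lon, records, limit=3):
    # Sort pre-plan records by distance from the user's position.
    return sorted(
        records,
        key=lambda rec: haversine_m(user_lat, user_lon,
                                    rec["lat"], rec["lon"]),
    )[:limit]

# Hypothetical pre-plan records (fabricated coordinates):
records = [
    {"name": "Warehouse, 40 Dock Rd", "lat": 35.05, "lon": -118.20},
    {"name": "School, 9 Elm St", "lat": 35.01, "lon": -118.17},
    {"name": "Mill, 2 River Rd", "lat": 35.30, "lon": -118.50},
]
nearby = closest_locations(35.00, -118.17, records)
print(nearby[0]["name"])
```

With the Search API, the sort-by-distance step would instead be expressed as a query over documents with a geo field, so the ranking happens server-side instead of in application code.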
Wednesday, April 17, 2013
Imagine being notified of a structure fire immediately and being presented with the address and nature of the incident right in your vision. Then imagine getting details of the incident location, such as type of occupancy, floor plans, and any other available info, all hands-free. Then your safety officer records and streams video of the fire as he arrives on scene, providing evidence for investigators and an extra set of eyes for the incident commander. This may be reality much sooner than you think.
Google Glass is here - at least for a few select "Glass Explorers". For those not familiar with Google's latest mobile device, Google Glass is a pair of glasses with a display in the user's field of vision. There is also a camera and a "bone conduction transducer" that pumps sound directly into your head. One can imagine the types of apps that could be beneficial - navigation, schematics displayed for workers so they never have to free up their hands, music for musicians. Glass will also be able to do Google+ hangouts, which could be a killer app. Could this be the next big thing in mobile computing? Only time will tell at this point, but it is definitely interesting and I can't wait to try this in some fire department use cases.
Google held a selection process (#ifihadglass) to find the first users of Google Glass. Yours truly actually got picked! At least two other firefighters have been selected as well (+Jeff Miller and +Max Wood). I have yet to get a pair (and may not at $1500!), but I am going to start working on a Glass application for the fire service. Interested? Join my Google+ community. I'm looking for testers.
Glass is expected to be available to the public late this year, and the price is not yet known. The Explorer edition is going for $1500; however, the final product will most likely cost much less.