I was invited back to MilCIS to present on the latest developments in Artificial Intelligence from Telstra Labs, covering video analytics and data-driven applications.
Telstra also presented on new collaboration tools, 5G, and our Connected Supply Chain project.
Exciting new developments included a demonstration of retina-level VR headsets, the new HoloLens 2 and an Explosive Ordnance Disposal robot.
Edge Computing and Video Analytics
I was asked to present a range of exciting demonstrations reflecting how Telstra is leveraging artificial intelligence to analyse video, and how this technology may be used in our products. Examples included search and rescue, asset inspection, construction, smart cities and robotics.
I also showcased a live demonstration of object detection using YOLO v3. This demo was particularly exciting, as the video was broadcast over 4G in real time to a GPU laptop acting as a "simulated edge computing server".
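Detectors like YOLO v3 typically emit many overlapping candidate boxes per object, which are then filtered with non-maximum suppression (NMS). As an illustrative sketch (not the code used in the demo), a minimal NMS step looks like this:

```python
# Minimal non-maximum suppression (NMS), a standard post-processing
# step for YOLO-style detectors: keep the highest-confidence box and
# drop overlapping duplicates. Illustrative only.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def nms(boxes, scores, iou_threshold=0.5):
    """Return indices of the boxes kept after suppression."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_threshold]
    return keep
```

In practice frameworks provide this step built in (e.g. OpenCV's DNN module), but the logic is the same: two boxes on the one water bottle collapse to the single most confident detection.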
Typically, video is sent from a device (in this case, a bodyworn camera) to the cloud, where it is processed. By moving the computational hardware closer to the end user (at the edge), the data does not need to travel as far, reducing the latency of these applications.
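The latency win can be sketched with back-of-envelope arithmetic. The distances and fibre speed below are illustrative assumptions, not measurements from the demo:

```python
# Propagation-only round-trip time: why shorter paths to the compute
# mean lower latency. Ignores processing and queueing delays.

SPEED_IN_FIBRE_KM_PER_MS = 200.0  # roughly 2/3 the speed of light in glass

def round_trip_ms(distance_km):
    """Round-trip propagation delay in milliseconds."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_MS

cloud_rtt = round_trip_ms(1500)  # hypothetical distant cloud region
edge_rtt = round_trip_ms(15)     # hypothetical metro edge site
```

Even before counting processing time, the hypothetical edge path is two orders of magnitude shorter, which is what makes interactive applications like live video analytics feasible.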
Video analytics can also be deployed within the camera itself, using purpose-built, optimised hardware.
It will be interesting to see where future applications will be deployed: in the cloud, at the distributed edge, or on the device itself!
Track and Monitor
Track and Monitor allows for enterprise assets to be tracked across Australia.
I showcased our LTE-enabled solar tracking device and a few of our Bluetooth asset tracking tags.
Telstra has explored the application of artificial intelligence to our Track and Monitor product, and this work was showcased to potential customers.
Connected Supply Chain
Telstra is investigating the benefits of a fully connected supply chain. This provides increased visibility and control to each member of the supply chain, and allows for more effective load balancing and optimisation.
5G and Cell on Wings
My colleague Peter Jones showcased the Cell on Wings, our Vapor 55 autonomous helicopter.
It carried two 4G small cells; a few weeks prior, Peter had also tested it carrying a P25 radio, to better support emergency responders in disaster recovery scenarios.
The helicopter operates tethered to the ground, providing both communications and power.
Microsoft Collaborative Tools
Telstra Purple, our professional services group, showcased a range of exciting new products from Microsoft.
The Surface Hub 2 was shown, and was an impressive upgrade from the Surface Hub 1 in terms of form factor and responsiveness.
I also had the opportunity to try the new HoloLens 2.
The new form factor was very impressive, with the battery at the back of the headset providing a much more comfortable weight distribution.
The field of view was greatly improved over the first-generation headset, and the hand tracking allowed for far more natural manipulation of data and holograms.
CISTECH are a system integrator of Radio over IP and network solutions.
They were showcasing a range of edge computing, voice and video transmission technologies.
I was lucky enough to have the opportunity to drive an Explosive Ordnance Disposal (EOD) robot.
I was able to pick up a water bottle with it and pass it to myself, using only an Xbox 360 controller as the interface.
I also had the chance to drive it around the floor, operating over the video link.
Bohemia Interactive Solutions
Bohemia Interactive were showcasing a range of military training and simulation experiences, through VR.
They were showcasing the Varjo VR headset, which demonstrates "human-eye" resolution.
I was truly blown away by the quality of the headset. Initial impressions were not that different from current-generation headsets, until I noticed the "sweet spot" of ultra-high, retina-level resolution occupying the centre of the display.
The demonstration was a flight simulator, showing the cockpit of a plane. Typically in VR, the resolution is too low to properly read dials and gauges, or text.
The dials were perfectly clear, and the dreaded "screen door effect" of being able to see the individual pixels on the VR headset screen was completely gone.
The high-resolution portion of the screen was an oval taking up the centre ~30% of my field of view. The rest of the screen was lower resolution, due to the limited capacity of the cables to carry the data required for a full retina-level display.
The high-resolution portion of the screen was actually a separate, motorised display. Future versions of the headset will implement eye tracking, and the motorised screen will move to follow the wearer's eye!
Interestingly, this was built on the ARMA III game engine, as the company split from Bohemia Interactive, the games studio, to focus on military training.
Leidos showcased their modular seafloor mapping capability, to assist fleets navigating reefs and other marine hazards.
Another solution demonstrated was a telepresence tool, leveraging computer vision to assist with physical rehabilitation.
A custom implementation of OpenPose, developed by CSIRO and Coviu, allowed for three-dimensional pose detection in a browser, using only an RGB camera.
This gives physical therapists measurable, trackable metrics of a patient's improvement.
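One way a rehabilitation metric could be derived from pose keypoints is the angle at a joint (say, a knee or elbow), computed from three detected points. This is a hypothetical helper for illustration, not the CSIRO/Coviu implementation:

```python
# Hypothetical example: derive a joint angle (degrees) from three 2D
# pose keypoints, e.g. hip -> knee -> ankle for knee flexion tracking.
import math

def joint_angle(a, b, c):
    """Angle at point b formed by the segments b->a and b->c."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))
```

Logged per session, an angle like this becomes a trend line a therapist can track, rather than a subjective impression of range of motion.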
Thales showcased a variety of technology, including a cyber ranging environment, a display to track squad members movements over a radio link, and a range of satellite receivers.
Realwear showcased their "see-what-I-see" augmented reality technology, in collaboration with Hewlett-Packard, demonstrating hands-free interfaces.
The video feed was sent to a range of edge computing boxes that can be deployed on-site to provide low-latency AI applications.
The video was run through an object detection algorithm, allowing assets in the field to be automatically recognised.
Schematics could be pulled up via voice search, and the headset was even able to zoom in on command.