AR-integrated live video – Addressing the Requirements of Immersive Experiences with AWS

The following figure illustrates an architecture for an AR application that event attendees can use on their mobile devices:

Figure 11.15 – Sporting event attendees viewing camera feeds from angles they can’t see

AWS Elemental Live offers a series of on-premises hardware appliances that can natively stream live video to other AWS-managed services in the Elemental family. These services live in-region and can take advantage of the global distribution capabilities of Amazon CloudFront and ML-powered dynamic ad insertion.

At the same time, these appliances can output video streams to other self-managed targets via the Zixi protocol. This is a proprietary UDP-based protocol with its own error-correction and congestion-management mechanisms that optimize it for lossless transport of live video. In this architecture, the live video streams are forked via this mechanism and sent over a 5G hotspot to a Zixi server running in AWS Wavelength.
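Zixi itself is proprietary, so its wire format can't be shown here, but the core idea it relies on, UDP transport hardened with application-level forward error correction, can be illustrated with a toy XOR-parity scheme. This sketch is purely illustrative (the function names and the fixed-size-packet assumption are mine, not Zixi's):

```python
# Toy forward error correction: one XOR parity packet per group of
# equal-sized data packets lets the receiver rebuild a single lost packet
# without retransmission -- the property that keeps live video latency low.

def make_parity(packets: list[bytes]) -> bytes:
    """XOR all packets in a group into one parity packet (equal sizes assumed)."""
    parity = bytearray(len(packets[0]))
    for p in packets:
        for i, b in enumerate(p):
            parity[i] ^= b
    return bytes(parity)

def recover(received: dict[int, bytes], parity: bytes, group_size: int) -> dict[int, bytes]:
    """Recover at most one missing packet by XORing the survivors with parity."""
    missing = [i for i in range(group_size) if i not in received]
    if len(missing) != 1:
        return received  # nothing lost, or too many losses for this scheme
    rebuilt = bytearray(parity)
    for p in received.values():
        for i, b in enumerate(p):
            rebuilt[i] ^= b
    received[missing[0]] = bytes(rebuilt)
    return received
```

Real protocols such as Zixi use far more sophisticated FEC and adaptive retransmission, but the trade-off is the same: a little extra bandwidth for parity data buys loss recovery without the round trips that TCP-style retransmission would cost a live stream.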

From there, other instances run a Unity-based AR application that dynamically subscribes to the specific camera feeds each user requests. The rendered output, with additional graphics and information overlaid, is then pixel-streamed to the attendee's device. This lets someone watching the event in person switch between camera feeds that offer views of the action they cannot see from their own vantage point. This is particularly desirable at events such as Formula 1 racing, where attendees only glimpse the cars briefly as they pass.
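The subscription logic described above can be sketched as a small routing table that maps each client to the single camera feed it is currently watching. This is a minimal sketch under my own assumptions (the `FeedRouter` class and its method names are hypothetical, not part of the actual application):

```python
# Hypothetical feed-subscription manager: each client watches one camera
# feed at a time; switching feeds simply replaces the previous subscription.

class FeedRouter:
    def __init__(self, camera_ids: set[str]):
        self.camera_ids = camera_ids
        self.subscriptions: dict[str, str] = {}  # client_id -> camera_id

    def subscribe(self, client_id: str, camera_id: str) -> None:
        if camera_id not in self.camera_ids:
            raise ValueError(f"unknown camera: {camera_id}")
        self.subscriptions[client_id] = camera_id  # switching replaces the old feed

    def unsubscribe(self, client_id: str) -> None:
        self.subscriptions.pop(client_id, None)

    def viewers(self, camera_id: str) -> list[str]:
        """Clients currently watching a camera; a feed with no viewers could
        be dropped upstream to save bandwidth in the Wavelength Zone."""
        return [c for c, cam in self.subscriptions.items() if cam == camera_id]
```

Keeping this state at the edge is what makes the dynamic subscription cheap: only the feeds that at least one attendee is watching need to be rendered and pixel-streamed.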

Summary

In this chapter, we reviewed the technologies that underpin VR, AR, and MR use cases. They’re not just about gaming or entertainment anymore; these technologies, combined with platforms such as Unity, are opening up possibilities in many areas.

We covered how Unity works in the context of online gaming, including elements such as dedicated servers and different types of clients. We reviewed how pixel streaming with Unity through NICE DCV or NVIDIA CloudXR opens up VR-based gaming to wider audiences by providing access without the need for high-specification user devices.

Next, we discussed how connected workers can utilize MEC to access crucial data and communication tools via AR glasses, even in remote or hazardous locations. We reviewed how AR and VR have begun revolutionizing workforce training and development, offering simulated, risk-free environments where skills can be honed effectively. Finally, we dove into the application of AR in sporting events, which promises a redefined spectator experience, merging live action with a wealth of digital information and visuals, thus enhancing understanding, engagement, and entertainment value for fans both on-site and remotely.

In the next chapter, we will cover how to turn AWS Snowcone into an IoT gateway.
